
SOLVED

Data Warehouse error sending report to FTP server


Level 2

Hi, I'm trying to send a basic report from Data Warehouse to an FTP server (basically a server in Azure) that receives files from other sources. I added the list of Adobe Analytics IPs (https://experienceleague.adobe.com/docs/analytics/technotes/ip-addresses.html?lang=en#all-adobe-anal...) to the firewall, but the report seems to fail. Error description: Request failed to send. Please check send parameters and destination connectivity.

 

I've verified that the login parameters are right using an FTP client, and that I have at least 2 GB of free space. The parameters I used are:

host: IP of the host server, let's say 178.178.178.178

port: 22

directory: /load

username: user1

password: pwd1

 

The report has failed and keeps sending me email notifications about the error. How can I stop this?

The report I created is scheduled to be sent just once (send immediately). Can I delete the report so it doesn't appear in the Request Manager tab? I don't see any remove button or unsubscribe option anywhere.

 

Regards

 

 

1 Accepted Solution


Correct answer by
Employee

Hi @myradio_digital,

Could you please clarify whether you are using Azure Blob or you have configured the server to work as FTP/SFTP? If it's Azure Blob, you will need to create a ticket with the Client Care team to have this configured manually. Currently this option is not available via the UI.

 

However, if the server is configured as FTP/SFTP, please note that Adobe uses port 22 for SFTP requests only. If the server doesn't support the secure protocol, I would recommend using port 21 (plain FTP) instead of 22.

In case the server works with SFTP, there is no need to use a password (Adobe doesn't use password-based authentication for SFTP and requires an SSH key pair), and you will need to configure your server based on this document: https://experienceleague.adobe.com/docs/analytics/export/ftp-and-sftp/secure-file-transfer-protocol/...

 

Hope this explains and clarifies the doubts!
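As a quick sanity check of the port advice above, you can probe which port the server actually accepts TCP connections on before resubmitting the DWH request. This is only a sketch: the IP is the placeholder from this thread, and it assumes a bash shell with the `timeout` utility available.

```shell
#!/usr/bin/env bash
# Probe whether a given port on the transfer server accepts TCP connections.
# Uses bash's /dev/tcp pseudo-device; the IP below is a placeholder.
check_port() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

# Example usage with the placeholder address from this thread:
# check_port 178.178.178.178 22   # SFTP (the only port Adobe uses for SFTP)
# check_port 178.178.178.178 21   # plain FTP fallback
```

If port 22 reports closed from outside your network but FileZilla works from inside, the firewall allow-list is the likely culprit rather than the credentials.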

 

 


14 Replies


Level 7

Hi @myradio_digital. Have you included sftp:// before the IP address? If not, can you please write the hostname as sftp://178.178.178.178? Since you are using an SFTP server, it is mandatory to put sftp:// before the IP address.


Level 2

Thank you for your quick reply. I've also tested the request writing sftp://178.178.178.178, but it fails as well. If I try to connect with a tool like FileZilla or WinSCP, it works perfectly and I can upload files to the machine. Is there any configuration to try, such as using another port like 21 to connect to the service? Could it be that the path is not correctly set? --> /load

 


Level 7
If the request is still failing, you can contact Client Care with the Request ID of the Data Warehouse request.


Level 2
Should the SSH key provided be uploaded to the /etc/ssh/ directory of the machine, or somewhere else?


Level 7

Keys should be present in the authorized_keys file (with no extension), which has to be kept in the .ssh folder in the home directory of the user you log in with on the SFTP server.
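For illustration, the key installation described above might look like the following sketch. The HOME_DIR path, the user/group names, and the key line itself are placeholders; in practice HOME_DIR would be the SFTP user's home directory (e.g. /home/myuser), and the key would be the one Adobe provides.

```shell
#!/usr/bin/env bash
# Install a public key for the SFTP user with the strict permissions sshd
# expects: sshd ignores authorized_keys files that are group/world-writable.
# HOME_DIR defaults to a scratch directory here; the key line is a placeholder.
HOME_DIR="${HOME_DIR:-$(mktemp -d)}"
mkdir -p "$HOME_DIR/.ssh"
printf '%s\n' 'ssh-rsa AAAA...placeholder-key... dwh@adobe' \
  >> "$HOME_DIR/.ssh/authorized_keys"
chmod 700 "$HOME_DIR/.ssh"
chmod 600 "$HOME_DIR/.ssh/authorized_keys"
# In production, also make the SFTP user the owner of the whole tree:
#   chown -R myuser:userg "$HOME_DIR/.ssh"
```

The 700/600 permissions matter: with StrictModes enabled (the default), sshd silently rejects keys whose file or parent directory is writable by anyone but the owner.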


Level 2
Ok, I'll check it. By the way, I'm not sure if the RSA key provided in Data Feeds offers the same level of access to Data Warehouse or if it is a completely different thing. Does the same apply to Azure Blob Storage?


Level 2
Hi, as jantzen_belliston-Adobe asked me to provide more information: I'm still having issues connecting both services (Data Warehouse and Data Feeds) to my SFTP server. What I tried is to upload the keys file to /home/myuser/.ssh/authorized_keys. I've changed the permissions on the file and also on the path so that the user is the owner: chown myuser:userg /home/myuser/.ssh/authorized_keys. In the config file (sshd_config) I've changed the following:

RSAAuthentication yes
PubkeyAuthentication yes
PermitRootLogin no
AuthorizedKeysFile /home/myuser/.ssh/authorized_keys
ChallengeResponseAuthentication no
Match Group userg
ChrootDirectory /home/myuser
X11Forwarding no
AllowTcpForwarding no
ForceCommand internal-sftp -d /.ssh
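One way to sanity-check directives like these before editing the live /etc/ssh/sshd_config is to write them to a scratch file and run sshd's built-in syntax check. This is only a sketch covering a subset of the directives mentioned; the AuthorizedKeysFile path is a placeholder from this thread.

```shell
#!/usr/bin/env bash
# Write candidate sshd directives to a scratch file and, if the sshd binary is
# available on this machine, run its configuration test mode against it.
CONF="$(mktemp)"
cat > "$CONF" <<'EOF'
PubkeyAuthentication yes
PermitRootLogin no
AuthorizedKeysFile /home/myuser/.ssh/authorized_keys
ChallengeResponseAuthentication no
EOF

# -t = test mode: check configuration validity without starting the daemon.
if command -v sshd >/dev/null 2>&1; then
  sshd -t -f "$CONF" && echo "syntax OK"
fi
```

After changing the real sshd_config, remember the daemon must be reloaded (e.g. systemctl reload sshd) before the new settings take effect.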



Level 2

I'm not using Azure Blob storage; the service is an SFTP server that is currently working, and I can upload files to it (so at least I know we can access it with SSH-2). I also want to know why the service is sending notifications about the failed reports even a few days later, when they are not scheduled and are supposed to be sent immediately. On the other hand, is it possible for Adobe to access the SFTP service using the user/password credentials provided, or is it mandatory to use a key (Adobe's authorized_keys file is in the .ssh directory within the home directory of the user you log in with) to access the service? I'm asking because the idea is to request several reports, and I don't know if I have to ask for a different key for each of them or if I can operate with only one (Adobe's SSH key).


Employee
DWH requests try to connect to the FTP server a few times before sending an error email. If the server is not configured properly as described in the document, it will certainly keep failing after some time. At this point I would request you to check whether the server is configured as per the document.


Level 2
If there is also the option to configure a Blob storage folder, what procedure/settings do I have to follow? I mean on the side of the Blob service: do I have to provide SSH keys, does the blob need to be public, or something else?


Employee
In the case of Azure Blob, there is no need to install SSH keys. I would request you to raise a ticket with the Client Care team if you want to configure Azure Blob for a DWH request at this point.


Level 2
In case I want to create two or more DWH requests in the future and have those reports stored in Blob storage, do I need to contact the Client Care team to configure each one? Does this operation have any extra cost or limit?


Level 10
Do any of the answers below answer your initial question? If so, can you select one of them as the correct answer? If none of the answers already provided answer your question, can you provide additional information to better help the community solve your question?