Data Warehouse error sending report to an FTP server

myradio_digital
Level 1

11-04-2021

Hi, I'm trying to send a basic report from Data Warehouse to an FTP server (basically a server in Azure) that receives files from other sources. I added the list of IPs (https://experienceleague.adobe.com/docs/analytics/technotes/ip-addresses.html?lang=en#all-adobe-anal...) to the firewall, but the report seems to fail. Error description: Request failed to send. Please check send parameters and destination connectivity.
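
(For reference, if the Azure server sits behind a network security group, allowlisting one of those Adobe ranges might look roughly like the sketch below. The resource group, NSG name, and rule name are placeholders, and 192.0.2.0/24 stands in for an actual range from the linked list.)

    # Hypothetical NSG rule allowing one Adobe Analytics range to reach port 22
    az network nsg rule create \
        --resource-group my-rg \
        --nsg-name my-nsg \
        --name allow-adobe-dwh \
        --priority 100 \
        --direction Inbound \
        --access Allow \
        --protocol Tcp \
        --source-address-prefixes 192.0.2.0/24 \
        --destination-port-ranges 22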


I've verified that the login parameters are correct using an FTP client, and that I have at least 2 GB of free space. The parameters I used are:

host: IP of the host server, let's say 178.178.178.178

port: 22

directory: /load

username: user1

password: pwd1
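
(A quick way to sanity-check these exact parameters from a shell, using the placeholder values above, is a one-shot sftp upload:)

    # Create a small test file and try writing it to /load on port 22
    touch test.txt
    echo "put test.txt /load/test.txt" | sftp -P 22 user1@178.178.178.178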


The report failed and keeps sending me email notifications of the error. How can I stop this?

The report I created is scheduled to be sent just once (send immediately). Can I delete the report so it doesn't appear in the Request Manager tab? I don't see a remove button or an unsubscribe option anywhere.


Regards


Accepted Solutions (1)

akt0m3r
Level 2

11-04-2021

Hi @myradio_digital,

Could you please clarify whether you are using Azure Blob storage or have configured the server to work as FTP/SFTP? If it's Azure Blob, you will need to create a ticket with the Client Care team to get this set up manually; currently this option is not available via the UI.


However, if the server is configured as FTP/SFTP, please note that Adobe uses port 22 for SFTP requests only. If the server doesn't support the secure protocol, I would recommend using port 21 (plain FTP) instead of 22.
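
(As a quick way to confirm which of the two ports is actually reachable from outside, netcat works; the IP below is the placeholder from the original post:)

    # Each command succeeds only if the port is open and reachable
    nc -vz 178.178.178.178 22   # SFTP
    nc -vz 178.178.178.178 21   # plain FTP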

If the server works with SFTP, there is no need to enter a password (Adobe doesn't use password-based authentication for SFTP and requires an SSH key pair), and you will need to configure your server based on this document: https://experienceleague.adobe.com/docs/analytics/export/ftp-and-sftp/secure-file-transfer-protocol/...
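
(A minimal sketch of the server-side key installation that document describes, assuming the SFTP login user is user1 and Adobe's public key has been saved as adobe_dw.pub; both names are placeholders:)

    # Append Adobe's public key to the login user's authorized_keys
    mkdir -p /home/user1/.ssh
    cat adobe_dw.pub >> /home/user1/.ssh/authorized_keys
    # sshd ignores authorized_keys with loose permissions
    chmod 700 /home/user1/.ssh
    chmod 600 /home/user1/.ssh/authorized_keys
    chown -R user1:user1 /home/user1/.ssh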


Hope this explains and clarifies your doubts!


myradio_digital

I'm not using Azure Blob storage; the service is an SFTP server that is currently working, and I can upload files to it (so at least I know we can access it with SSH-2). I also want to know why the service is still sending notifications about the failed reports even a few days later, given they are not scheduled but are set to be sent immediately.

On the other hand, is it possible for Adobe to access the SFTP service using the user/password credentials provided, or is it mandatory to use a key (Adobe's authorized_keys file is in the .ssh directory within the root directory of the user you log in with) to access the service? I'm asking because the idea is to request several reports, and I don't know if I have to ask for a different key for each of them or whether I can operate with just one (Adobe SSH key).

akt0m3r
DWH requests try to connect to the FTP server a few times before sending an error email. If the server is not configured properly as described in the document, it will keep failing for a while. At this point I would ask you to check whether the server is configured as per the document.
myradio_digital
If there is also the option to configure a Blob storage folder, what procedure/settings do I have to follow? I mean on the Blob service side: do I have to provide SSH keys, does the Blob need to be public, or something else?
akt0m3r
If you use Azure Blob, there is no need to install SSH keys. I would ask you to raise a ticket with the Client Care team if you want to configure Azure Blob for DWH requests at this point.
myradio_digital
If I want to create two or more DWH requests in the future and have those reports stored in Blob storage, do I need to contact the Client Care team to configure it? Does this operation have any extra cost or limit?

Answers (1)

amgup
Level 4

11-04-2021

Hi @myradio_digital. Have you included sftp:// before the IP address? If not, can you please write the hostname as sftp://178.178.178.178? Since you are using an SFTP server, it is mandatory to write sftp:// before the IP address.

myradio_digital

Thank you for your quick reply. I've tested the request also writing sftp://178.178.178.178, but it still fails. If I try to connect with a tool like FileZilla or WinSCP, it works perfectly and I can upload files to the machine. Is there any configuration I should try, such as using another port like 21 to connect to the service? Could it be that the path is not correctly set? --> /load


amgup
If the request is still failing, you can contact Client Care with the Request ID of the Data Warehouse request.
myradio_digital
Should the SSH key provided be uploaded to the /etc/ssh/ directory of the machine, or somewhere else?
amgup

Keys should be present in an authorized_keys file (with no extension), which has to be kept in the .ssh folder in the root directory of the user on the SFTP server.
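
(If it helps, a quick way to verify placement and permissions from the server side; user1 is a placeholder for the login user, and the auth log path varies by distro:)

    # .ssh should be 700 and authorized_keys 600, both owned by the login user
    ls -ld /home/user1/.ssh /home/user1/.ssh/authorized_keys
    # Watch the auth log while Adobe retries to see why key auth is rejected
    tail -f /var/log/auth.log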

myradio_digital
OK, I'll check it. By the way, I'm not sure whether the RSA key provided in Data Feeds offers the same level of access for Data Warehouse or whether it is a completely different thing. Does the same apply to Azure Blob Storage?
myradio_digital
Hi, as jantzen_belliston-Adobe asked me to provide more information: I'm still having issues connecting both services (Data Warehouse and Data Feeds) to my SFTP server. What I tried is uploading the keys file to /home/myuser/.ssh/authorized_keys. I've changed the permissions on the file and also on the path so that the user is the owner:

    chown myuser:userg /home/myuser/.ssh/authorized_keys

In the config file (sshd_config) I've changed the following:

    RSAAuthentication yes
    PubkeyAuthentication yes
    PermitRootLogin no
    AuthorizedKeysFile /home/myuser/.ssh/authorized_keys
    ChallengeResponseAuthentication no
    Match Group userg
        ChrootDirectory /home/myuser
        X11Forwarding no
        AllowTcpForwarding no
        ForceCommand internal-sftp -d /.ssh
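
(After changes like these, it's worth validating the config before reloading sshd; the service name may differ by distro:)

    # Check sshd_config syntax; prints nothing on success
    sudo sshd -t
    # Apply the changes
    sudo systemctl restart sshd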