Data Warehouse error sending report to FTP server

myradio_digital

11-04-2021

Hi, I'm trying to send a basic report using Data Warehouse to an FTP server (basically, it's a server in Azure) where it receives files from other sources. I included the list of IPs (https://experienceleague.adobe.com/docs/analytics/technotes/ip-addresses.html?lang=en#all-adobe-anal...) in the firewall, but the report seems to fail. Error description: Request failed to send. Please check send parameters and destination connectivity.
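If the allowlist was added as an Azure network security group (NSG) rule, one way to double-check that the Adobe ranges are actually in place is sketched below; the resource group and NSG names are placeholders, and the ufw line only applies if a host firewall is also running:

# List the inbound rules on the NSG in front of the SFTP server and confirm the Adobe ranges appear
az network nsg rule list --resource-group my-rg --nsg-name my-nsg --output table

# If a host firewall is also in play (e.g. ufw on Ubuntu), check it as well
sudo ufw status numbered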

 

I've verified that the login parameters are correct using an FTP client and confirmed that I have at least 2 GB of free space (a quick command-line check with the same parameters is sketched after the list). The parameters I used are:

host: IP of the host server, let's say 178.178.178.178

port: 22

directory: /load

username: user1

password: pwd1
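A minimal sketch of that check, assuming the stock OpenSSH sftp client is available locally and that you have shell access on the Azure server to check disk space:

# Connect with exactly the parameters given to Data Warehouse (port 22 = SFTP)
sftp -P 22 user1@178.178.178.178

# On the server itself, confirm free space on the filesystem holding /load
df -h /load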

 

The report has failed and is constantly sending me emails with the notification of the error. How can I stop this?

The report I created is scheduled to be sent just once (send immediately). Can I delete the report so it doesn't appear in the Request Manager tab? I don't see any remove button or unsubscribe option anywhere.

 

Regards

 

 


amgup

11-04-2021

Hi @myradio_digital. Have you included sftp:// before the IP address? If not, please write the hostname as sftp://178.178.178.178. Since you are using an SFTP server, it is mandatory to put sftp:// before the IP address.

myradio_digital

Thank you for your quick reply. I've tested the request also writing sftp:178.178.178.178, but it also fails. If I try to connect with a tool like FileZilla or WinSCP, it works perfectly and I can upload files to the machine. Is there any configuration, such as using another port like 21, to try to connect to the service? Could it be that the path is not correctly set? --> /load
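One way to check how that path is being interpreted is with the plain OpenSSH sftp client (nothing Adobe-specific; testfile.txt is just a placeholder):

# Connect with the same credentials Data Warehouse uses
sftp -P 22 user1@178.178.178.178

# Inside the session: where do you land, and does /load resolve from there?
sftp> pwd
sftp> cd /load
sftp> put testfile.txt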

 

amgup
If the request is still failing, you can contact Client Care along with the Request ID of the Data Warehouse request.
myradio_digital
Should the SSH key provided be uploaded to the /etc/ssh/ directory of the machine, or somewhere else?
amgup

The keys should be present in an authorized_keys file (with no extension), which has to be kept in the .ssh folder in the home directory of the SFTP user on the server.
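A minimal sketch of placing the key, assuming the SFTP account is user1 and the public key file is saved locally as adobe_key.pub (both names are placeholders):

# Create the .ssh folder in the user's home directory if it does not exist yet
mkdir -p /home/user1/.ssh

# Append the public key to authorized_keys (note: no file extension)
cat adobe_key.pub >> /home/user1/.ssh/authorized_keys

# sshd ignores the file unless the permissions are restrictive
chmod 700 /home/user1/.ssh
chmod 600 /home/user1/.ssh/authorized_keys
chown -R user1:user1 /home/user1/.ssh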

myradio_digital
OK, I'll check it. By the way, I'm not sure if the RSA key provided in Data Feeds offers the same level of access for Data Warehouse or if it is a completely different thing. Does the same apply to Azure Blob Storage?
myradio_digital
Hi, as jantzen_belliston-Adobe asked me to provide more information: I'm still having some issues connecting both services (Data Warehouse and Data Feeds) to my SFTP server. What I tried is to upload the key file to /home/myuser/.ssh/authorized_keys. I've changed the permissions of the file and also of the path so that the user is the owner:

chown myuser:userg /home/myuser/.ssh/authorized_keys

In the config file (sshd_config) I've changed the following:

RSAAuthentication yes
PubkeyAuthentication yes
PermitRootLogin no
AuthorizedKeysFile /home/myuser/.ssh/authorized_keys
ChallengeResponseAuthentication no
Match Group userg
    ChrootDirectory /home/myuser
    X11Forwarding no
    AllowTcpForwarding no
    ForceCommand internal-sftp -d /.ssh
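After edits like these, a couple of server-side checks usually help narrow things down. This is only a sketch, assuming a systemd-based Ubuntu/Debian layout (service names and log paths differ on other distributions):

# Validate the sshd_config syntax before applying it
sudo sshd -t

# Restart the daemon so the Match block takes effect
sudo systemctl restart sshd

# OpenSSH refuses chrooted logins unless ChrootDirectory (and every parent directory)
# is owned by root and not writable by group or others
sudo chown root:root /home/myuser
sudo chmod 755 /home/myuser

# Watch the auth log while Data Warehouse retries the delivery
sudo tail -f /var/log/auth.log    # /var/log/secure on RHEL/CentOS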