myradio_digital
Community profile: myradio_digital, Level 1, 0 badges
Joined the community 06-11-2020 5:42:10 AM
Re: Data warehouse error sending report via to FTP server
myradio_digital (Level 1 · 0 likes · 10 posts · 0 solutions) - Adobe Analytics
Hi, as jantzen_belliston-Adobe asked me to provide more information: I'm still having issues connecting both services (Data Warehouse and Data Feeds) to my SFTP server. What I tried is to upload the key file to /home/myuser/.ssh/authorized_keys. I've changed the permissions on the file and also on the path so that the user is the owner: chown myuser:userg /home/myuser/.ssh/authorized_keys. In the config file (sshd_config) I've changed the following: RSAAuthentication yes, PubkeyAuthentication ...
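Beyond ownership, OpenSSH's StrictModes check (on by default) also rejects keys when the permission bits are too loose. A minimal sketch of the bits it typically expects, using a scratch directory in place of the real /home/myuser:

```shell
# Sketch: the permission bits OpenSSH's StrictModes check expects
# before it will honor authorized_keys. A temp dir stands in for
# the real /home/myuser on the server.
home=$(mktemp -d)
mkdir -p "$home/.ssh"
touch "$home/.ssh/authorized_keys"

chmod 755 "$home"                       # home dir: not group/world-writable
chmod 700 "$home/.ssh"                  # .ssh: owner-only
chmod 600 "$home/.ssh/authorized_keys"  # key file: owner read/write only

stat -c '%a %n' "$home/.ssh" "$home/.ssh/authorized_keys"
```

If any of these are looser (for example group-writable), sshd silently ignores the key and falls back to password auth, which would match the symptoms described.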

Views: 38 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
OK, I'll check it. By the way, I'm not sure if the RSA key provided in Data Feeds offers the same level of access to Data Warehouse, or if it is a completely different thing. Does the same apply to Azure Blob Storage?

Views: 174 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
Should the SSH key provided be uploaded to the /etc/ssh/ directory of the machine, or somewhere else?
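For context on the question above: /etc/ssh/ normally holds the host keys and sshd_config, while a public key used for login goes into the authorized_keys file of the account that will log in. A sketch, with a temp dir standing in for the user's home and a dummy placeholder key line:

```shell
# Sketch: append the provided public key to the login user's
# authorized_keys, not to /etc/ssh/. The key string below is a
# dummy placeholder, and a temp dir stands in for the user's home.
home=$(mktemp -d)
mkdir -p "$home/.ssh"
echo "ssh-rsa AAAA...placeholder adobe-dw" >> "$home/.ssh/authorized_keys"
chmod 700 "$home/.ssh"
chmod 600 "$home/.ssh/authorized_keys"
grep -c '^ssh-rsa' "$home/.ssh/authorized_keys"   # count of installed keys
```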

Views: 193 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
In case I want to create two or more DWH requests in the future and have those reports stored in Blob Storage, do I need to contact the Client Care team to configure it? Does this operation have any extra cost or limit?

Views: 152 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
If there is also the option to configure a Blob Storage folder, what procedure/settings do I have to follow? I mean on the side of the blob service: do I have to provide SSH keys, or does the blob need to be public (or something else)?

Views: 163 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
I'm not using Azure Blob Storage; the service is an SFTP server that is currently working, and I can upload files to it (so at least I know we can access it with SSH-2). I also want to know why the service is sending notifications about failed reports even a few days later, when they are not scheduled but required to be sent immediately. On the other hand, is it possible for Adobe to access the SFTP service using the user/password credentials provided, or is it mandatory to use a key (Adob...

Views: 177 · Likes: 0 · Replies: 0
Re: Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
Thank you for your quick reply. I've also tested the request writing sftp:178.178.178.178, but it fails as well. If I try to connect with a tool like FileZilla or WinSCP it works perfectly and I can upload files to the machine. Is there any configuration, such as using other ports like 21, to try to connect to the service? Can it be that the path is not correctly set? --> /load
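One way to narrow this down is to check which ports the destination actually accepts connections on before changing the DWH settings. A sketch using bash's /dev/tcp redirection; the IP is the placeholder one from the post, so substitute the real server. Note that SFTP runs over SSH on port 22, while port 21 is plain FTP, a different protocol.

```shell
# Sketch: probe ports 21 (FTP) and 22 (SFTP/SSH) on the destination.
# HOST is the placeholder IP from the post, not a real server.
HOST=178.178.178.178
probe() {
  if timeout 3 bash -c "exec 3<>/dev/tcp/$HOST/$1" 2>/dev/null; then
    echo "port $1: open"
  else
    echo "port $1: closed or filtered"
  fi
}
results="$(probe 21; probe 22)"
echo "$results"
```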

Views: 213 · Likes: 0 · Replies: 0
Data warehouse error sending report via to FTP server
myradio_digital - Adobe Analytics
Hi, I'm trying to send a basic report using Data Warehouse to an FTP server (basically a server in Azure) that receives files from other sources. I included in the firewall the list of IPs (https://experienceleague.adobe.com/docs/analytics/technotes/ip-addresses.html?lang=en#all-adobe-analytics-ip-address-blocks), but the report seems to fail. Error description: "Request failed to send. Please check send parameters and destination connectivity." I've ensured that the login parameters are right...

Views: 316 · Likes: 0 · Replies: 14
Adobe Analytics API Slow performance when using two machines
myradio_digital - Adobe Analytics
Hi, I'm new in the community. My problem is: I've tried to perform two large extractions from different machines over the same VRS. I'm using different project credentials (to be clear, let's say I have project1 --> machine1 and project2 --> machine2). I've tried to launch the two queries over different dimensions at the same time, but I saw in the logs that the performance drops and it works dead slow (only 5 requests per minute). I'm using the 2.0 API to extract data using the Python API. As ...
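Since the slowdown appears even with separate project credentials, it is consistent with a throttle applied per company rather than per project; that is an assumption from the symptoms, not something the post confirms. If so, the only client-side fix is to pace both machines so their combined rate stays under the limit. A sketch with a stub standing in for the real 2.0 API call:

```shell
# Sketch: pace requests so two jobs combined stay under an assumed
# shared throttle. send_request is a stand-in for the real call to
# the Analytics 2.0 reports endpoint, not actual API code.
send_request() { echo "request $1 sent"; }

# With an assumed shared budget of 60 requests/min split across 2
# machines, each machine would sleep ~2s between calls; 0.1s here
# just keeps the sketch fast.
out="$(for i in 1 2 3; do send_request "$i"; sleep 0.1; done)"
echo "$out"
```

The point is that adding a second machine does not add throughput if the limit is shared; it only divides it.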

Views: 77 · Likes: 0 · Replies: 0
Re: Adobe Analytics 2.0 API
myradio_digital - Adobe Analytics
Hi ishans52004352, I've tried to run two different queries from two different machines (with the same credentials), but I've noticed some decrease in performance. I'm sure I'm not reaching the query limit (judging by the retrieval speed of the API). Is there any configuration that allows me to run the two extractions at the highest throughput, so the performance of both is not compromised?

Views: 407 · Likes: 0 · Replies: 0