myradio_digital - Level 1 - Online - since 06-11-2020
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
OK, I'll check it. By the way, I'm not sure whether the RSA key provided in Data Feeds offers the same level of access to Data Warehouse or whether it is a completely different thing. Does the same apply to Azure Blob Storage?

Views 17 · Likes 0 · Replies 0
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
Should the SSH key provided be uploaded to the /etc/ssh/ directory of the machine, or somewhere else?
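On a standard OpenSSH-based SFTP server, a *public* key that grants someone access goes into the receiving user's ~/.ssh/authorized_keys file, not into /etc/ssh/ (that directory holds the server's own host keys). A minimal sketch, assuming a Linux host; the key content below is a placeholder for the key actually provided:

```shell
# Append the provided public key to the SFTP user's authorized_keys.
# The key string here is a placeholder; paste the real key you received.
PUBKEY='ssh-rsa AAAAB3...placeholder... adobe-dw'
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
printf '%s\n' "$PUBKEY" >> "$HOME/.ssh/authorized_keys"
# authorized_keys must not be writable by anyone but the owner,
# or sshd will refuse to use it.
chmod 600 "$HOME/.ssh/authorized_keys"
```

Note this must be done under the account that Data Warehouse logs in as, since sshd only consults that user's own authorized_keys.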

Views 36 · Likes 0 · Replies 0
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
In case I want to create two or more DWH requests in the future and have these reports stored in the blob storage, do I need to contact the Client Care team to configure it? Does this operation have any extra cost or limits?

Views 27 · Likes 0 · Replies 0
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
If there is also the chance to configure a blob storage folder, what procedure/settings do I have to follow? I mean on the side of the blob service: do I have to provide SSH keys, or does the blob need to be public (or something else)?

Views 38 · Likes 0 · Replies 0
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
I'm not using Azure Blob Storage; the service is an SFTP server that is currently working, and I can upload files to it (so at least I know we can access it with SSH-2). I also want to know why the service keeps sending notifications about the failed reports even a few days later; they are not scheduled, they are required to be sent immediately. On the other hand, is it possible for Adobe to access the SFTP service using the user/password credentials provided, or is it mandatory to use a key (Adob...

Views 52 · Likes 0 · Replies 0
Re: Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
Thank you for your quick reply. I've tested the request also writing sftp:178.178.178.178, but it also fails. If I try to connect with a tool like FileZilla or WinSCP it works perfectly and I can upload files to the machine. Is there any configuration, such as using other ports like 21, to try to connect to the service? Can it be that the path is not correctly set? --> /load
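One quick sanity check on the destination string: SFTP normally runs on port 22 (plain FTP uses 21), and the host, port, and path can be verified by parsing the URL form of the destination. A small sketch with Python's standard library; the user name and path here are placeholders:

```python
from urllib.parse import urlparse

# Placeholder destination; substitute the real user, host, port, and path.
dest = urlparse("sftp://ftpuser@178.178.178.178:22/load")

# Checking each component separately helps rule out a malformed host
# string: e.g. "sftp:178.178.178.178" has no "//", so the address ends
# up in .path instead of .hostname and no connection can be made.
print(dest.scheme, dest.username, dest.hostname, dest.port, dest.path)
```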

Views 56 · Likes 0 · Replies 0
Data warehouse error sending report to FTP server
myradio_digital - Adobe Analytics
Hi, I'm trying to send a basic report using Data Warehouse to an FTP server (basically it is a server in Azure) where it receives files from other sources. I included in the firewall the list of IPs (https://experienceleague.adobe.com/docs/analytics/technotes/ip-addresses.html?lang=en#all-adobe-analytics-ip-address-blocks), but the report seems to fail. Error description: Request failed to send. Please check send parameters and destination connectivity. I've ensured that the login parameters are right...
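When checking the firewall, it can help to verify programmatically that the allow-list actually covers the addresses the server sees connecting, since a single mistyped CIDR silently blocks delivery. A minimal sketch with Python's standard ipaddress module; the CIDR below is a documentation range, not a real Adobe block, so substitute the blocks from the page linked above:

```python
import ipaddress

# Placeholder allow-list; replace with the CIDR blocks from the Adobe
# IP-address page (this one is the TEST-NET-3 documentation range).
allowed_blocks = [ipaddress.ip_network("203.0.113.0/24")]

def is_allowed(ip: str) -> bool:
    """Return True if `ip` falls inside any allow-listed block."""
    addr = ipaddress.ip_address(ip)
    return any(addr in block for block in allowed_blocks)
```

Running is_allowed over the source addresses in the SFTP server's auth log would show immediately whether the delivery attempts are being dropped by the firewall.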

Views 118 · Likes 0 · Replies 12
Adobe Analytics API Slow performance when using two machines
myradio_digital - Adobe Analytics
Hi, I'm new in the community. My problem is: I've tried to perform two large extractions from different machines over the same VRS. I'm using different project credentials (to be clear, let's say I have project1 --> machine 1 and project2 --> machine 2). I've tried to launch the two queries over different dimensions at the same time, but I can see from the logs that the performance drops and everything works dead slow (only 5 requests per minute). I'm using the API 2.0 to extract data using the Python API. As ...
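One possible explanation, assuming the 2.0 API is throttled per company rather than per client machine: two machines drawing on the same quota would slow each other down regardless of which project credentials they use. A client-side limiter that caps the combined request rate at least makes the throughput predictable; the limit values here are placeholders, so use the figures from Adobe's own rate-limit documentation:

```python
import time
from collections import deque

class Throttle:
    """Sliding-window limiter: at most max_calls per period seconds.

    max_calls/period are placeholders for Adobe's documented limits.
    """

    def __init__(self, max_calls: int, period: float) -> None:
        self.max_calls = max_calls
        self.period = period
        self.calls: deque[float] = deque()

    def wait(self) -> None:
        """Block until another request may be sent, then record it."""
        now = time.monotonic()
        # Drop timestamps that have left the sliding window.
        while self.calls and now - self.calls[0] >= self.period:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call in the window expires.
            time.sleep(self.period - (now - self.calls[0]))
            self.calls.popleft()
        self.calls.append(time.monotonic())
```

Splitting one documented quota across the two machines (half of it to each Throttle) would keep either extraction from starving the other.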

Views 32 · Likes 0 · Replies 0
Re: Adobe Analytics 2.0 API
myradio_digital - Adobe Analytics
Hi ishans52004352, I've tried to perform two different queries from two different machines (with the same credentials), but I've noticed some decrease in performance. I'm sure that I'm not reaching the query limit (judging by the retrieval speed of the API). Is there any configuration that allows me to run the two extractions at the highest throughput, so that the performance of neither extraction is compromised?

Views 53 · Likes 0 · Replies 0