I have around 610 URLs that need to be used to create a segment. Can you please guide me?
If you don't have your URLs in an eVar, you won't be able to build a segment for them either... the URL captured in page_url (or the g param on your server calls) is only available through Raw Data Feeds or Data Warehouse exports; it will not surface in Workspace.
Unless someone has already created a custom dimension, you will not be able to create a segment for this....
If you have the URL in an available dimension but don't have the ability to create classifications, there is no easy way to add 610 URLs into a segment... You can try the following:
1. Use Excel to concatenate all the URLs into a single string, with each URL separated by a space, and try adding them to:
eVarX (url) contains any of [add the string here]
But that might not allow for that much text (I suspect it won't accept 600+ URL strings)
2. Create a series of "OR" statements like the above, but break the list of URLs into more manageable groups per string:
eVarX (url) contains any of [add the string here - group 1]
OR
eVarX (url) contains any of [add the string here - group 2]
OR
eVarX (url) contains any of [add the string here - group 3]
....
3. Or create a URL check for each of the 600+ URLs separately, joined with OR:
eVarX (url) contains [URL 1]
OR
eVarX (url) contains [URL 2]
OR
eVarX (url) contains [URL 3]
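If you have the 610 URLs in a spreadsheet column, the grouping in option 2 can be automated instead of done by hand. Below is a minimal sketch in Python, assuming a plain list of URLs; the group size of 50 is an arbitrary assumption you would tune to whatever length the segment builder's text field actually accepts:

```python
# Split a long URL list into space-separated groups, one per
# "contains any of" OR clause in the segment builder.
# group_size=50 is an assumption -- tune it to the field's real limit.

def build_segment_strings(urls, group_size=50):
    """Return a list of space-separated URL strings, one per OR clause."""
    return [
        " ".join(urls[i:i + group_size])
        for i in range(0, len(urls), group_size)
    ]

# Stand-in data: 610 placeholder URLs
urls = [f"https://example.com/page{n}" for n in range(610)]
groups = build_segment_strings(urls)
print(len(groups))  # 610 URLs in groups of 50 -> 13 OR clauses
```

Each resulting string would then be pasted into one `eVarX (url) contains any of [...]` condition.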
Are there any other dimensions being collected on those pages that would let you pull them together more efficiently than matching specifically on URL?
If your suite is large and has a lot of URLs, you run the risk of those URLs exceeding the Workspace unique value limit; anything beyond it gets lost in the "(Low Traffic)" bucket, and your segment won't be able to pull anything in there...
If you don't need "fully" live data (as in, you can wait the 4-6 hours for classification rules to process), I would almost be tempted to create a classification under your URL eVar (I assume you are tracking the URL in something you can use in Workspace). Then you should be able to use the rule importer, basically tagging those 600+ URLs with something akin to a "group name". After that, you can just create a segment that looks for the classification being your specific group name.
If you have similar needs in the future, you can reuse the same classification and group the new URLs under a new "group name".
For instance, create a classification called "URL Group" on your eVar
Then import rules for the 610 URLs that set the "URL Group" classification to something descriptive (e.g. "election articles 2023" or "super sale products feb 2023"; I don't know what your URLs are, so I don't know what name you should use)
Later, you can create groups for "election articles 2024" or "super sale products nov 2023", etc. These are just basic ways to identify the content while leaving room for future use; date-specific names may not matter for your case, but they help when similarly named content could recur.
Also, when you say "610 URLs", are those repeated URLs due to multiple campaign codes? Because you can simplify your life by using regex rules to match just the base URL (which will catch any variant of it)
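To illustrate the idea of collapsing campaign-code variants to a base URL, here is a minimal sketch, assuming the variants only differ by query string or fragment (the pattern is an assumption; adapt it if your URLs also vary in other ways):

```python
import re

def base_url(url):
    """Drop everything from '?' or '#' onward, plus any trailing slash."""
    return re.split(r"[?#]", url, maxsplit=1)[0].rstrip("/")

# Two variants of the same page collapse to one base URL
print(base_url("https://example.com/sale?cid=email_feb"))  # https://example.com/sale
print(base_url("https://example.com/sale/"))               # https://example.com/sale
```

Deduplicating the 610 URLs through something like this first could shrink the list considerably before you build rules.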
Once your classifications are processed, you can simply create a segment looking at
"URL Group (eVarX)" equals group name
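The import file for the classification step can be generated from the URL list rather than typed out. A hypothetical sketch of its general shape follows: a tab-delimited file mapping each URL (the eVar key) to a "URL Group" value. The column headers and exact format must match what the classification importer expects, so treat this as the rough shape, not a final template:

```python
import csv

# Stand-in data: 610 placeholder URLs and an example group name
urls = [f"https://example.com/article{n}" for n in range(610)]
group_name = "election articles 2023"

# Write a tab-delimited mapping: eVar key -> classification value
with open("url_group_import.tab", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["Key", "URL Group"])  # key column + classification column
    for url in urls:
        writer.writerow([url, group_name])
```

Once the file is uploaded and the rules have processed, the single segment condition on "URL Group" replaces all 610 URL checks.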
@Jennifer_Dungan thanks for the quick reply.
My problem is more complex, though: I have neither admin access to the Workspace tool nor the JavaScript knowledge to create an eVar.
I just want a method to add the URLs of target companies into my Workspace so that I can use a segment to analyse my data.
If there is an Excel template where I can upload such URLs, plus a methodology for adding the URLs and creating the segment, that would be great.
Otherwise I have to add them manually in the area shown in the attached screenshot.
You're welcome... I wish I had a better answer for you!
Like I said, for 600+ URLs, classifications would make your life a lot easier. Maybe you can ask your admin to help support this?