Product Updates | Community

Sign Up for the Adobe AEM Assets Digital Librarian Beta Exam

  ADLS is starting the next round of beta tests for our Business User Credential Exam program, this time focusing on Experience Manager Assets. We need as many people as possible to complete the beta exam within the beta period so we can ensure the exam is accurate and fair. Participants can be the first to earn this new credential and receive a ready-to-share virtual badge.

ABOUT THE EXAM

The Adobe Qualified: AEM Assets Digital Librarian exam is designed to assess a candidate's knowledge of organizing assets, managing assets and their metadata, distributing and locating assets in the DAM repository, and using workflows to manage business processes in AEM Assets. The exam is rigorously developed and professionally administered to provide a highly respected Digital Librarian credential.

This is a business user credential; however, candidates of all skill levels are encouraged to take the beta exam, as this allows us to measure the exam's accuracy more effectively against our target candidates' skill level.

WHAT TO EXPECT AS A BETA TESTER

Participation is free of cost.
You can take the exam anytime between June 28 and July 21, but it must be completed in one sitting.
The exam takes about 120 minutes.
You will be asked to provide feedback on exam questions during the exam.
You will receive your test results 4–5 weeks after the test period ends.
Credentials will be awarded to those who earn a passing score.

NEXT STEPS

Sign up to take the beta exam. Once you sign up, you will see an on-screen acknowledgment that your form was successfully submitted.
Instructions on how to take the beta exam will be sent beginning on June 28th, and on a rolling basis for subsequent sign-ups. The earlier you sign up, the more time you will have to complete the exam.
Complete the exam between June 28 and July 21.
Get your teammates and associates to sign up too!
It is critical to our exam development schedule that as many people as possible take the exam as soon as they can, because a minimum number of completed exams is needed to ensure our test is valid. Thank you for your help in ensuring the accuracy and validity of this important new credential! Questions? Contact adlsbeta@adobe.com. © Adobe. All rights reserved.

Blueprint Released: Offboarding | Departing User Dashboard

It happens to every system or group admin: one of your users is leaving the company, taking on a new role, or needs to switch groups. How do you find everything they own or are working on in Workfront so you can transfer ownership of their objects or reassign their work? Well, we've built a dashboard for this purpose! Released as a blueprint, this dashboard contains 13 reports to help you find, reassign, or otherwise manage the objects and assignments associated with a user departing your instance of Workfront.

A few things to note:

The reports on the dashboard have been built to support the needs of a wide audience, and it's likely that you will need to change filters, views, and groupings on some reports to better meet your organization's needs.
You may not need all the reports, so feel free to remove them.
All reports have tags in their descriptions for searchability: #offboarding
For active project, task, and issue approvals, we've found that a report is not the easiest way to reassign them from a departing user. Instead, log in as the departing user and go to My Updates. From there, choose the Delegate approvals button and pick another user; this reassigns them all at once.
Finally, we recommend making copies of these reports before editing them, in case you want to go back to the originals.

This dashboard is now available to install from your blueprints. Learn how to install this dashboard from blueprints

Reports Included in the Offboarding | Departing User Dashboard:

Departing User Active Projects: Use this project report to find active projects owned by the departing user by entering their name into the prompt.
Departing User Open Tasks: Use this task report to find open tasks assigned to the departing user by entering their name into the prompt.
Departing User Open Issues: Use this issue report to find open issues assigned to the departing user by entering their name into the prompt.
Departing User Pending Proof Approvals: Use this proof approval report to find proof approvals assigned to the departing user by entering their name into the prompt.
Approvals Delegated to Departing User: Use this user delegation report to find project, task, and issue approvals delegated to the departing user (To User) by entering their name into the prompt. If the departing user has approvals delegated to them, log in as the From User and change the delegation to someone else.
Departing User-Owned Portfolios: Use this portfolio report to find portfolios owned by the departing user by entering their name into the prompt.
Departing User-Owned Programs: Use this program report to find programs owned by the departing user by entering their name into the prompt.
Departing User-Owned Project Templates: Use this project template report to find project templates owned by the departing user by entering their name into the prompt.
Departing User Assigned in Template Tasks: Use this template task report to find template tasks where the departing user is an assignee by entering their name into the prompt.
Departing User As Default Assignee on Queue Topic: Use this queue topic report to find queue topics where the departing user is the default assignee by entering their name into the prompt.
Departing User Reports Running on Their Account: Use this report to find reports running on the departing user's account by entering their name into the prompt.
Departing User-Created Reports: Use this report to find reports created by the departing user by entering their name into the prompt.
Departing User-Created Dashboards: Use this dashboard report to find dashboards created by the departing user by entering their name into the prompt.

Blueprint Released: Value Realization | Review and Approve Dashboard

If you're a system admin, product owner, champion, or manager of Workfront users, you've probably been asked to demonstrate the value Workfront is bringing to your business. We've identified five key areas in which your business can realize value. Workfront provides the ability to:

Centralize work in one solution
Manage work processes
Review and approve digital work
Govern compliance workflows
Deliver client-facing services

We've created a dashboard that helps you identify value realized from the third bullet point, and we call it Value Realization | Review and Approve. We've thought about all types of approvals in Workfront, including documents/proofs, projects, tasks, and issues. Review and approve features help you streamline the approval process by reviewing assets in one easy-to-use system and eliminating the cost of mistakes. This dashboard is now available to install from your blueprints. Learn how to install this dashboard from blueprints.

A few things to note:

The reports on the dashboard have been built to support the needs of a wide audience, and it's likely that you will need to change the filters, views, and groupings in some reports. This is especially the case when it comes to filtering on status or date/time.
You may not need all the reports, so feel free to remove them.
All reports have tags in their descriptions for searchability: #reviewandapprove #valuerealization
Finally, we recommend making copies of these reports before editing them, in case you want to go back to the originals.

The Review and Approve Reports:

Document Versions by Version Number (Version Audit): Shows the number of document versions created over time, grouped by version number. The filter only shows the most recent version. Fewer versions per document imply higher efficiency. Use this report to understand the volume of versions and decide whether you need to streamline your review or intake processes.
Document/Proof Collaboration by Document Name: Shows consolidated feedback by surfacing the comments made on documents and proofs. High levels of commenting indicate strong team collaboration. Recommendation: add a filter by Portfolio or Program to show only collaboration on certain projects.
Proof Review Time by Approver: Shows the number of proof decisions over time. Faster time to decision indicates higher efficiency in approval time.
Document Review Time by Approver: Shows the number of document decisions over time. Faster time to decision indicates higher efficiency in approval time.
Projects with Approval Processes: Shows the number of project approvals, both completed and in flight. Faster time to decision indicates higher efficiency in approval time.
Tasks with Approval Processes: Shows the number of task approvals, both completed and in flight. Faster time to decision indicates higher efficiency in approval time.
Issues with Approval Processes: Shows the number of issue approvals, both completed and in flight. Faster time to decision indicates higher efficiency in approval time.

Blueprint Released: Value Realization | Core Value Dashboard

If you're a system admin, product owner, champion, or manager of Workfront users, you've probably been asked to demonstrate the value Workfront is bringing to your business. We've identified five key areas in which your business can realize value. Workfront provides the ability to:

Centralize work in one solution
Manage work processes
Review and approve digital work
Govern compliance workflows
Deliver client-facing services

We've created a dashboard that helps you identify value realized from the first and second bullet points, and we call it Value Realization | Core Value. These reports help you translate the benefits of centralizing work in one solution and managing work processes into measurable values that you can track over time to drive better outcomes. This dashboard is now available to install from your blueprints. Learn how to install this dashboard from blueprints

A few things to note:

The reports on the dashboard have been built to support the needs of a wide audience, and it's likely that you will need to change the filters, views, and groupings in some reports. This is especially the case when it comes to filtering on status or date/time.
You may not need all the reports, so feel free to remove them.
All reports have tags in their descriptions for searchability: #centralizework #manageworkprocesses #valuerealization
Finally, we recommend making copies of these reports before editing them, in case you want to go back to the originals.

The Core Value Reports:

Active Projects by Portfolio & Program: Shows all active projects (projects that are not in statuses that equate with Complete or Dead). Use this report to understand the work currently in progress across your organization.
Active Projects by Progress Status: Shows all active projects. Use this report to understand which projects are on track to deliver on time.
Active Projects by Owner & Status: Shows all active projects. Use this report to understand project ownership and the progress/status of in-flight projects.
Active Projects by Priority: Shows all active projects. Use this report to understand how in-flight work aligns to your priorities.
Active Tasks Assigned to Direct Reports Grouped by Manager: Shows all incomplete tasks, grouped by the assignee's manager and progress status. Use this report to visualize the workload of your direct reports.
Projects Completed by Month & Group This Year: Shows all projects completed this year. Use this report to understand the volume of completed work by group.
Completed Projects by Portfolio & Progress Status: Shows all completed projects, grouped by portfolio, month of completion, and progress status. Use this report to understand the overall volume of completed work across portfolios.
Completed Parent Tasks by Progress Status: Shows all completed parent tasks, grouped by month of completion, project, and progress status. Use this report to understand the overall volume of completed work and on-time delivery.
Completed Child Tasks by Progress Status: Shows all completed child tasks, grouped by month of completion, project, and progress status. Use this report to understand the overall volume of completed work and on-time delivery at the most granular level.
Completed Issues by Month: Shows all completed issues, grouped by month of completion and project. Use this report to understand the overall volume of completed issues and cycle time.
Planned Hours & Duration vs. Actual on Completed Projects by Group: Shows the variance between planned and actual hours and duration on completed projects. Use this report to analyze project cycle time and level of effort. In the variance columns, negative numbers indicate early completion or a lower level of effort, while positive numbers indicate lateness or a higher level of effort.
Project Duration Variances by Portfolio & Template: Shows the difference in days between each project's actual duration and the template duration, and the difference between the project's actual hours and the template hours. In the variance columns, negative numbers indicate early completion or a lower level of effort, while positive numbers indicate lateness or a higher level of effort.
Number of Projects by Group & Template: Shows all projects by group and template. Use this report to understand template usage across groups and analyze how many work processes are captured in Workfront.
Planned Hours & Duration vs. Actual on Completed Milestone Tasks by Milestone Path: Shows the variance between planned and actual hours and duration on completed milestones. Use this report to analyze milestone cycle time and level of effort. In the variance columns, negative numbers indicate early completion or a lower level of effort, while positive numbers indicate lateness or a higher level of effort.
Completed Milestone Task Duration Averages: Shows all completed milestone tasks, grouped by milestone name. Use this report to see the average planned and actual duration of your milestones on the Summary tab.
Number of Requests Per Request Queue by Queue Topic This Year: Shows how many requests have been submitted this year, grouped by queue and queue topic. Use this report to understand request volume.
Report Usage Over Time: Shows the sum of report views by month. Use this report to understand user interaction with reports each month, which is an indication of adoption.
Collaboration by User: Shows all the direct messages entered in the system, grouped by the users who entered them. Use this report to understand collaboration over time. If you want to see the messages, add a column for Note >> Note Text.
Project Collaboration by Month: Shows all the direct messages entered in the system at the project level, grouped by entry month and project. Use this report to understand project collaboration over time.
Task Collaboration by Project: Shows all the direct messages entered in the system at the task level, grouped by entry month and project. Use this report to understand task collaboration over time.
Issue Collaboration by Project: Shows all the direct messages entered in the system at the issue level, grouped by entry month and project. Use this report to understand issue collaboration over time.

Bot Name, Bot Page Views, and Bot Occurrences Now Available in Analysis Workspace

As of June 7, 2023, improved bot reporting is available in Analysis Workspace. Adobe has introduced a Bot Name dimension that shows the names of bots detected using bot rules. These rules can be the default IAB rules or custom bot rules that your organization configures. The dimension is helpful when you want to learn more about which bots are visiting your site, or which bots generate the most traffic.

The Bot Name dimension is an Adobe-provided component. Using it, you will see the names of the bots detected by the default rules or by the custom bot rules you supplied. Bot Name should only be used with the Bot Page Views and Bot Occurrences metrics; if this dimension is used with any other Analytics metric, no data will be returned, because it is unassociated with normal customer-based data in Analytics. Data processing for Bot Page Views began between February 26 and March 10, 2023; as a result, reporting windows prior to that date will not have data.

The Bot Page Views metric shows the number of times a bot was set or persisted on a page. It should only be used with the Bot Name, Page, or standard time dimensions, such as Day or Week. Bot Occurrences shows the number of hits where bot traffic was detected. Like Bot Page Views, the Bot Occurrences metric should only be used with the Bot Name, Page, or standard time dimensions.

Please note that this dimension automatically collects data if you have enabled bot rules. If you have not yet enabled bot rules, this dimension does not appear in Analysis Workspace.
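To make the idea of bot rules concrete, here is a minimal sketch of rule-based bot detection. This is purely illustrative and not Adobe's implementation: the pattern list is a tiny hypothetical sample, not the IAB bot list, and the function names are invented for this example. The point is simply that each rule maps a request signature (here, the User-Agent string) to a bot name, which is what the Bot Name dimension then surfaces.

```python
import re

# Hypothetical sample of bot-name rules; a real deployment would use the
# IAB list or your organization's custom rules, evaluated server-side.
BOT_PATTERNS = {
    "Googlebot": re.compile(r"Googlebot", re.IGNORECASE),
    "Bingbot": re.compile(r"bingbot", re.IGNORECASE),
}

def detect_bot(user_agent: str):
    """Return the matched bot name, or None for normal traffic."""
    for name, pattern in BOT_PATTERNS.items():
        if pattern.search(user_agent):
            return name
    return None

print(detect_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # Googlebot
print(detect_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # None
```

Hits where `detect_bot` returns a name would count toward Bot Occurrences, while normal traffic stays in the regular customer-based metrics.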

[Release Update] Adobe Journey Optimizer May 2023 Release

May 2023 Release Notes: Check out what's new, improved, and fixed in the latest Adobe Journey Optimizer product release: Release Notes

New capabilities:

Content Experimentation in campaigns: Adobe Journey Optimizer now supports experiments in campaigns. Experiments are randomized trials, which in the context of online testing means that you expose some randomly selected users to a given variation of a message, and another randomly selected set of users to some other variation or treatment. After exposure, you can measure the outcome metrics you are interested in, such as email opens, subscriptions, or purchases.
Create and use fragments in your email content: You can now author, use, and manage fragments to quickly assemble your emails and content templates. A fragment is a prebuilt, reusable component that can be referenced in multiple emails across Journey Optimizer campaigns and journeys for an improved and accelerated design process.
Use Tags in your campaigns (Beta): You can now assign Adobe Experience Platform Unified Tags to your campaigns. This allows you to classify them easily and improve search from the campaigns list. Note that the Unified Tags feature is currently in beta.
Personalized Optimization AI ranking model (General Availability): Personalized Optimization AI ranking models are now generally available in Decision Management. This new type of model allows you to optimize and personalize offers based on segments and offer performance.

You can find details on the improvements included in this release here: Latest Release

Feel free to reach out with any questions or feedback in the comment section below.
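The randomized-assignment idea behind content experimentation can be sketched in a few lines. This is a generic illustration of how online experiments typically split users into variations, not Journey Optimizer's API; the function and variable names are invented. Hashing the user ID makes the split effectively random across users while keeping each individual user in the same bucket on every exposure.

```python
import hashlib

def assign_variant(user_id: str, variants: list, salt: str = "experiment-1"):
    """Deterministically map a user to one of the variations.

    The salt distinguishes experiments, so the same user can land in
    different buckets across different experiments.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

variants = ["subject_line_a", "subject_line_b"]
# The same user always receives the same variation within one experiment:
assert assign_variant("user-42", variants) == assign_variant("user-42", variants)
```

After exposure, you would compare outcome metrics (opens, subscriptions, purchases) between the buckets to measure which treatment performs better.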

What’s cooking! Share your best communication “recipes” with us today!

Good news for those familiar with our very popular Reporting Cookbook: we're cooking up a new one! What's a Workfront "Cookbook," you ask? It's a collection of snackable, easily digestible examples on different topics that you can take and test out in your own Workfront kitchen. This time we're cooking up ideas for how Workfront System Admins communicate with their end users (think: how to share new features with users, or tips & tricks, or how to ask for feedback, etc.).

There's no "one size fits all" when it comes to communications. Some Admins communicate regularly; some only communicate when there are major changes. Some Admins communicate directly with end users; others communicate through Group Admins or SMEs. The Communications Cookbook will showcase real examples of how, when, and why your peers are communicating with users. (We'll sprinkle in some best practices along the way.)

In order for this cookbook to become a family heirloom, we need YOUR examples by the end of May [DEADLINE EXTENDED] Friday, June 9, 2023! To submit your recipe, all you need to do is fill in the attached template and email it to Kristin Farwell at farwell@adobe.com. All recipes in the cookbook will be made public, so don't include any confidential data in your screenshots, and be sure to blur out names, etc.

Need a little inspiration? Check out the attached example from our very own Workfront Admins at Adobe! If you have questions, drop a note below or email Kristin using the address above. We're so excited to see what you all have to share!

The new Work Time Field: Now you can adjust user capacity AND calculate availability based on the User’s Schedule

As of March 2, 2023, Workfront offers a new and improved way to calculate a user's capacity: the Work Time field, which lets you see the true availability of your global teams based on region-specific schedules.

Until now, if you wanted to adjust the amount of time a user had each day for project work, you had to adjust the FTE (Full-Time Equivalent) value. This presented an obstacle when the user's schedule, rather than the default schedule, was used to calculate resource availability, because in that case the FTE value was ignored.

Now, with the new Work Time field located in the user profile, you can adjust capacity and still have resource availability calculated using the user's schedule. The available hours in the Resource Planner and Workload Balancer will reference the user's schedule and reflect availability based on the Work Time value entered.

The Work Time field is set by entering a decimal value between 0 and 1:

To count all work hours each day, use the value 1 (the default). Example: if a team member with an 8-hour workday can devote all 8 hours to project work, the value should be 1, and the resource management tools will reflect 8 hours of availability.
To reduce the amount of time a user can devote to project work each day, use a decimal value between 0 and 1. Example: if a team member with an 8-hour workday can devote only 6 hours of each day to project work, enter 0.75, and the resource management tools will reflect 6 hours of availability.

Check out the Work Time field and test its ability to adjust capacity in your Preview environment, or with a few select users in your Workfront production environment.
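The arithmetic behind the examples above is simple, and a tiny sketch makes it explicit. This is an illustration of the calculation described in this post, not Workfront code; the function name is invented: daily availability is the scheduled hours multiplied by the Work Time decimal.

```python
def available_hours(schedule_hours_per_day: float, work_time: float) -> float:
    """Daily availability = scheduled hours x Work Time (decimal, 0 to 1)."""
    if not 0 <= work_time <= 1:
        raise ValueError("Work Time must be between 0 and 1")
    return schedule_hours_per_day * work_time

# An 8-hour schedule with the default Work Time of 1 -> 8 hours available.
print(available_hours(8, 1))     # 8.0
# An 8-hour schedule with Work Time 0.75 -> 6 hours available.
print(available_hours(8, 0.75))  # 6.0
```

Note that because Work Time scales the user's own schedule, a 7-hour regional schedule with Work Time 0.75 yields 5.25 available hours, which is exactly the behavior the old FTE approach could not deliver when the user's schedule was in effect.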

New Commenting Beta experience

Hi All,

We are excited to announce that we have enabled a new Commenting Beta experience in your Preview environments. The new experience is aimed at providing better performance and enabling the delivery of modern commenting features and functionality. For now it is available for the issue object only, and it will be gradually enabled for other Workfront objects and locations.

The toggle is planned to be made available in production with the 23.2 release, so we invite you to test the new experience and share your valuable feedback. There is an in-app feedback button when you switch to the Beta experience.

Some of the long-awaited feature enhancements available with this experience:

Real-time updates
Editing comments
Removing people from threads
Seeing comments separately from system updates

We would also like your feedback on your usage of the following features, which are available in the current experience but not in the Beta commenting experience:

Log Time
Editing Custom form
Updating task and issue status-related data
Condition
Percent Complete
Planned Completion Date

We are considering changing how these work, so we would like your opinion. We would really appreciate it if you could provide your feedback by filling out this 5-minute survey or jumping on a 30-minute call where we can discuss your feedback or specific use cases in more detail.

Thank you in advance for your time and feedback!