Adobe Summit - generative AI takes a front-row seat

Barb Mosher Zinck, March 22, 2023
Summary:
The Adobe Summit is in full swing, and there’s no shortage of product announcements. As we dive into what’s new, we start to see the impact of generative AI on creating and driving better customer experiences across all channels.

Adobe used its Summit in Las Vegas to announce Adobe Sensei GenAI, a set of generative AI services that will be natively integrated into many of their products, including Adobe Experience Manager (AEM), Journey Optimizer, Journey Analytics, Marketo Engage, and Real-time CDP.

Adobe refers to the services as a “co-pilot,” much the same way Microsoft does. It said multiple LLMs (large language models) would be used, including ChatGPT through Microsoft Azure OpenAI Service and Google Research’s FLAN-T5 (Fine-tuned LAnguage Net - Text-To-Text Transfer Transformer). The models used will depend on business needs “stemming from brand guidelines, product vocabulary, and customer insights.”

The first of these services is Adobe Firefly. With Firefly, users can describe in their own words what they want in order to generate new images, audio, vectors, video, and 3D content, and apply creative elements like brushes, color gradients, and video transformations.

Firefly will be integrated into Adobe Creative Cloud, Document Cloud, Experience Cloud, and Adobe Express, with the first integrations being Adobe Express, Experience Manager (AEM), Photoshop, and Illustrator. The first release, now in beta, only enables you to quickly generate high-quality images and text effects.

This first model is trained on Adobe Stock Images (which include hundreds of millions of assets), openly licensed content, and public domain content where the copyright has expired. Future models will leverage other assets, technology, and training data from Adobe and others.

For creators who don’t want their work used in training AI models, a new “Do Not Train” tag can be applied, which will stay with the content where it’s used, stored, or published. Adobe is also working on a compensation model for Adobe Stock contributors, but the details won’t be released until Firefly is out of beta.

For companies that want to ensure content is on brand, Firefly can be trained on their existing collateral so that generated content follows brand language and guidelines.

All content generated will be safe for commercial use. Eventually, Firefly will be made available to other platforms via APIs.

Enhancing Experience Manager

Continuing with the generative AI theme, Firefly will be integrated directly into Adobe AEM Assets - its digital asset management system. This direct integration makes it easy for teams to instantly change image components and automatically generate asset variations for different channels.

But that’s just the start. Companies will be able to create a language model that trains on customer data and content, enabling a range of capabilities through Adobe’s products that should improve experiences across the board.

For example, within Adobe Journey Optimizer, marketers can create message variations for email and mobile. They will be able to edit and rephrase copy by selecting a tone of voice or indicating keywords. The same also holds for creating and managing website copy in Adobe AEM Sites.

In addition, Marketo Engage is positioned as leveraging GenAI to power Dynamic Chat, and Adobe Journey Analytics will get intelligent captions and text-based descriptions of key takeaways on visualizations such as cohort tables and fallout charts.

Adobe Sensei is also pitched as bringing new insights to content analytics by helping marketers understand how content performs at the attribution level. One example they gave is learning that East Coast Gen Z women respond best to orange color tones and a casual voice.

The ability to understand how content is performing at this level would help marketers segment audiences at a deeper level, leading to more personalized campaigns and experiences. This is a good thing because today's segment-based personalization is often too high-level to ensure accuracy and effectiveness.

Another new capability in AEM is the ability for content creators to create, edit, and publish content through Microsoft Word or Google Docs. Content templates are used, and security is applied to ensure only the right people can publish content, but this is an interesting capability that should ease publishing workflows. There’s no AI mentioned here, but it stands to reason that we should see it come into play inside Word and Google Docs as they continue to grow their built-in generative AI capabilities.

Adobe also announced a new product at the Summit - Adobe Express for Enterprise. Express itself is not new; it’s a design tool built for non-designers. This new enterprise version is integrated with AEM and uses Firefly.

Supporting the content supply chain

If you are in marketing, you understand the work involved in building campaigns and experiences, including everything from developing images, videos, new copy, and other content and ensuring it all comes together properly to meet deadlines. Multiple teams, including external resources, typically perform this work, and managing it all gets complicated.

Adobe says it has the answer to this challenge with its Content Supply Chain, which brings together the Creative Cloud for Enterprise, Adobe Workfront (project and campaign management), AEM (Sites, Assets, segments, and content profiles), Express for Enterprise and Frame.io (video collaboration).

Essentially, it’s integration between the tools to ensure smoother workflows. For example, there’s a Workfront plugin in the Creative Cloud so designers can see the work assigned to them and submit their work for review without leaving their design tools. Adobe also plans to integrate Sensei GenAI into creative and marketing workflows to help create variations quickly and speed up testing and adapting.

New product analytics

Adobe is helping bring product teams and marketing together through its new Adobe Product Analytics in Experience Cloud. It gives product teams self-service capabilities to learn how their products are perceived and used, helping them fine-tune product-led growth by improving the product based on that feedback.

Product managers can use product analytics to find patterns and audience trends, analyze customer engagement over time, and so on. Along with helping to improve product roadmaps based on what customers actually use and want, marketers and product managers can connect these insights to action through Journey Optimizer, bridging the traditional gap between product and marketing teams.

Innovations for Real-Time CDP

The last piece of news I’ll talk about is a set of innovations for Adobe’s CDP product, some of which include AI enhancements. First, the CDP will leverage GenAI to automatically generate audience segments, similar to how Adobe Analytics will do it. Again, the ability to generate well-defined segments will improve personalization significantly.

New Generative Playbooks will provide use case templates built on Adobe’s expertise, proprietary data, and experiences across industries. Marketers will also be able to leverage conversation-based AI to author new use cases and request suggestions for alternative use cases.

The CDP is also helping to improve account-based marketing efforts by enabling marketers to connect with unknown prospects within known accounts through channels like Marketo Engage and LinkedIn.

Adobe is dealing with the end of third-party cookies through new partnerships with Merkle and Epsilon to help enrich first-party data. In addition, it has added new channel integrations with Amazon Ads and TikTok and new integration with LiveRamp for connected TV and audio. Finally, there’s a new integration with Adobe Commerce to leverage CDP-stored customer preferences in shopping experiences and to gain new insights into shopping behavior.

My take

Any tool that helps content creators work better is a good tool to have. ChatGPT opened many possibilities to work smarter and faster (it scared a lot of creators too, but that’s another story). Adobe talks about the increasing demand for content and the impact that demand has on companies. Teams are getting smaller and budgets tighter, so meeting these demands is putting a lot of stress on workers.

But there’s also the need for the right content; content that supports what audiences and customers want. Figuring out what that right content is remains a challenge in itself. Generative AI helps here by creating variations that are tested and adapted until the right content emerges.

Adobe is on target to both create content and measure content performance. It’s a lot of talk right now, with only a portion of what’s been announced available (and some in beta), but that seems to be the norm these days - to announce before it’s actually there. However, at the rate things are changing with AI, it probably won’t be that long of a wait. 

[Updated later on day of publication to add further content]
