On the eve of the IBC Show, Adobe on Sept. 12 introduced new video features that it said will become available later this year in the next version of its Creative Cloud set of applications and services.
The updates include animation powered by the company’s Adobe Sensei artificial intelligence (AI) and machine learning (ML) technology, along with end-to-end virtual reality (VR) 180 support. The new features will allow filmmakers and video professionals to “spend more time shaping their next creative project and less time on repetitive editing tasks,” according to the company.
Creative Cloud users will be able to “mold layers into new, dynamic shapes with new Mesh Sculpting tools that twist, bend and scale under your creative control” in After Effects, Adobe’s visual effects and motion graphics software for Creative Cloud, it said in a news release. By “leveraging” AI and ML in Sensei, users will be able to “instantly create and animate unique, stylized puppets using a webcam and reference artwork” with Character Animator’s new Characterizer, it said.
The new support features for 180-degree immersive video in Adobe Premiere Pro video editing and production software and After Effects, meanwhile, include optimized ingest, effects and output in Google VR 180 for viewing on YouTube or other platforms, Adobe said.
“We’re living in an age where there’s never been more video storytelling,” Bill Roberts, senior director of product management for video at Adobe, told reporters in a recent online news briefing ahead of the announcement.
“The boundaries of every format – from documentary to drama – keep getting pushed,” he said, adding: “Every side of the equation from animation through to live action is out there pushing new boundaries…. We’re really at the cusp of innovative storytelling and, as the volume increases and the demand to be more creative increases, there’s a need to do more.”
One of the themes that Adobe stressed in the briefing was that the company is “trying to deliver smarter tools to allow people to get more of their ideas onto the screen,” Roberts said.
One “pillar” of the latest Creative Cloud release this year is the need to “work smarter, not harder,” so Adobe was out to figure out “how can we use AI and our Sensei technology to take away the drudgery and allow you to stay focused on creative storytelling,” he told reporters.
He explained: “When we think about this new reality of video… it’s not just about mastering something for the silver screen or for broadcast. It’s really about connecting with your audience everywhere they are – making sure you can get all of your content out to platforms like Facebook, YouTube and Twitter.”
“Sensei is sprinkled through everything,” he said of the Creative Cloud update. While its Character Animator software was already “built around a bunch of AI that was developed inside of Adobe,” he said, “now, we’re applying it to actually creating puppets in a simple and easy manner.”
Meanwhile, “as we start to embrace the world even further in VR and 360 content, you’ll see an increase in how we’re dealing with depth inside of products like After Effects,” Roberts told reporters.
At the VidCon conference in Anaheim, Calif., June 19, Adobe introduced Project Rush, a new app that it said automatically syncs all of one’s digital workflows to the cloud, enabling them to be used anywhere, on any device.
Noting that was a “new area for us,” Roberts told reporters during the IBC briefing that Project Rush was “targeted at a different type of user than people would normally think about when you’re thinking about” Premiere, After Effects and Adobe Audition audio editing software.
He went on to say: “If we look at the new reality of video creation, there is a new class of creator. These are people who want to express and share their ideas on platforms like YouTube and Facebook. But it’s really about their ideas, and video is just a conduit. So, what they want is an all-in-one cross-device application that allows them to go from [the] shoot to distribution and have all the tools presented in a simple framework.” Project Rush enables that and “everything you do in [it] can move forward into Premiere Pro, so you can finish your editing experience. But it allows a new class of people to communicate deeply and effectively with video,” he said.
Project Rush is available in beta now and will be made available later this year, Adobe said.
“Video professionals face short deadlines, clunky handoffs and long lists of deliverables,” Steve Warner, VP of digital video and audio at Adobe, said in the Sept. 12 news release. He added: “This latest Creative Cloud release introduces new innovation and capabilities to address these challenges and make common tasks faster and easier.”
Also new to Creative Cloud are improved Adobe Stock workflows. Users will be able to “search and sort millions of curated, contemporary 4K and HD cinematic footage and professionally-designed Motion Graphics templates, right from” the Essential Graphics panel in Premiere Pro and After Effects, Adobe said.
Other new capabilities coming later this year to Creative Cloud include: intelligent audio cleanup with the new DeNoise and DeReverb tools in the Essential Sound panel; new Lumetri Color tools in Premiere Pro and After Effects that take the guesswork out of curve adjustments and “bring simplicity and precision to selective color grading and color management”; data-driven infographics in Premiere Pro that let users drag and drop spreadsheet files onto Motion Graphics templates to generate visual representations of information within video projects; and seamless collaboration, according to Adobe.
IBC attendees will be able to get a closer look at the new features at the Adobe booth (#7.B35, Hall 7, RAI Amsterdam) and at more than 100 partner booths, Sept. 13-18, Adobe said. For more information on pricing, visit https://www.adobe.com/creativecloud/plans.html.
Multiple subscription plans for Adobe Stock are available at https://stock.adobe.com/plan.