Snapchat’s parent company Snap on Tuesday released its new generation of generative artificial intelligence technology that will let users see more natural special effects when filming with their phone cameras, in a bid to counter rival social media platforms.
Snap has been at the forefront of the augmented reality (AR) market, a technology that overlays computer graphics on real-world scenes in photos and videos. Though Snapchat remains small relative to Meta, it believes that expanding its range of sophisticated and playful AR filters, or lenses, will draw additional users and advertisers to the platform.
AR developers can now build AI-powered lenses, and Snapchat users will be able to apply them to their content, the company said.

Snap Inc., based in Santa Monica, California, also launched an upgraded version of its developer program, Lens Studio, which will help artists and developers create AR features for Snapchat as well as for other websites and apps.
Bobby Murphy, Snap’s chief technology officer, said the new Lens Studio could cut the time needed to create AR effects from weeks to mere hours and enable the production of more intricate work. Murphy said the tools open up more creative and innovative possibilities while preserving ease of use, allowing even new users to build unique AR effects.
The new Lens Studio includes a suite of generative AI tools. Among them is an AI assistant that answers developers’ questions, and another tool lets artists generate three-dimensional images from text prompts, without having to build 3D models manually.
Previously, AR technology was relatively basic, limited to simple effects such as adding a hat to a person’s head in a video. Snap’s latest advances enable more sophisticated lenses that can, for example, keep the hat moving in sync with the person’s head and adapt it to the lighting conditions in the video. Murphy also described what might come next, such as full-body AR experiences that generate entire new outfits, something that is currently very challenging.