News
The Future Group Unveils Significant Mixed Reality Advances in Pixotope Version 1.3 Posted: 23/07/2020

The Future Group, creators of live photo-realistic virtual production system Pixotope®, today unveils its latest Version 1.3 software featuring a wide range of advances that significantly improve how virtual environments interact with real-world elements.

Pixotope enables the production of mixed-reality (MR) content by bringing together physical components such as presenters, actors, props and free-moving cameras, with virtually created assets such as scenes, graphics, animated characters, or any other computer-generated elements. Pixotope forms the central production hub when creating mixed-reality content for broadcast and live events, with Version 1.3 offering new object tracking, powerful lighting integration, enhanced colour management and more.

 

Tags: The Future Group | Pixotope | Version 1.3 | virtual elements | object tracking | virtual production | mixed-reality | Oystein Larsen | lighting control | colour management | Unreal game engine | Marcus Blom Brodersen | MKM Marcomms | MKM Marketing Communications
Submitted by MKM Marketing Communications
More from this author
Mo-Sys introduces the one-stop 4K virtual studio Posted: 16/07/2020

Mo-Sys, a world leader in precision camera tracking solutions for virtual studios and augmented reality, has brought virtual studio production within reach of everyone with StarTracker Studio, the world’s first pre-assembled production package. The system is scalable to any size production, and can support 4K Ultra HD.

Critical for virtual studio and augmented reality production is tracking the position of each camera in three-dimensional space across all six axes of movement. StarTracker from Mo-Sys is proven as the most precise and reliable camera tracking package, using dots on the studio ceiling (“stars”) which are placed at random and tracked to plot camera positions with extraordinary accuracy.

 

Tags: mo-sys | camera tracking | augmented reality | virtual studio production | StarTracker Studio | 4k | ultra HD | camera | mounts | virtual graphics | Unreal Engine | Ultimatte keyer | lenovo | rack | blackmagic | canon | cartoni
Submitted by Manor Marketing
More from this author
Mo-Sys puts the crowd back into sport Posted: 19/06/2020

London, UK, 19 June 2020: Mo-Sys Engineering, a global leader in real-time camera tracking and remote systems, has announced a revolutionary approach to bringing the atmosphere back to live sport amid COVID-19 restrictions. Providing precision, zero-latency tracking for any camera (including ultra-long box lenses for sport), the Mo-Sys camera tracking kit interfaces directly to the Unreal Engine or any broadcast render engine, allowing production companies to add virtual crowds to stands.

“After so many weeks, sports fans are desperate for any action,” said Michael Geissler, CEO of Mo-Sys. “But the frustration will turn to disappointment if the atmosphere of the game falls flat because of empty stands. We have developed a camera tracking kit which any outside broadcast can implement quickly and simply, capable of filling the stands with a virtual, but enthusiastic, crowd.”

The Mo-Sys camera tracking encoders are quickly mounted onto broadcast standard Vinten Vector heads, with no impact on the camera’s perfect balance and no backlash when panning and tilting. Zoom data is collected either by gear encoders or by a serial data link to digital lenses. The combined tracking data is sent over ethernet to the workstation hosting the augmented reality software.
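The data flow described above — pan/tilt and zoom readings combined and sent over Ethernet to the workstation hosting the AR software — can be sketched in a few lines. This is a purely illustrative example, not Mo-Sys's actual wire protocol; the packet layout, field names and port number are all assumptions.

```python
import socket
import struct

# Hypothetical packet layout: a frame counter plus pan/tilt (from the head
# encoders) and zoom (from the lens encoder or serial link), little-endian.
PACKET_FORMAT = "<Iddd"  # frame, pan_deg, tilt_deg, zoom_mm

def pack_tracking_sample(frame: int, pan_deg: float,
                         tilt_deg: float, zoom_mm: float) -> bytes:
    """Serialise one tracking sample for transmission over UDP/Ethernet."""
    return struct.pack(PACKET_FORMAT, frame, pan_deg, tilt_deg, zoom_mm)

def unpack_tracking_sample(payload: bytes) -> tuple:
    """Decode a sample on the workstation running the AR render engine."""
    return struct.unpack(PACKET_FORMAT, payload)

# Sending side (the encoder interface):
payload = pack_tracking_sample(1, 12.5, -3.25, 86.0)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(payload, ("192.168.1.50", 9000))  # workstation address (assumed)

# Receiving side unpacks each datagram before feeding the render engine:
frame, pan, tilt, zoom = unpack_tracking_sample(payload)
```

Using a compact fixed-size binary packet like this keeps per-sample overhead low, which matters when tracking data has to arrive once per video frame with negligible latency.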

“We are known for the absolute precision and stability of our camera tracking – that’s why Hollywood relies on our technology,” Geissler added. “In this application, we deliver precise tracking, including compensation for lens distortion, even when a 100:1 lens is zoomed fully.”

Mo-Sys has worked with Epic Games to develop a tight interface to the Unreal Engine, including support for the latest version 4.25 software. The result is that highly photo-realistic augmented reality – such as crowds filling the stands – can be integrated into live production with no limitations and negligible latency. The kit includes the bolt-on encoding kit for Vinten heads and the lens calibration tools.

Users can see the technology in action in a Mo-Sys LiveLab webinar, which will also include contributions from Epic Games and Canon. The webinars are on 30 June, at 10.00 (register at https://bit.ly/2N1Ve44) and repeated at 18.00 (register at https://bit.ly/30GqFZP). Michael Geissler of Mo-Sys will also join a distinguished panel with the RTS Thames Valley Creative Centre’s look at production techniques for audience shows in a time of pandemic on Thursday 25 June at 17.00 (register free at https://rts.org.uk/event/future-studio-audience).

###

About Mo-Sys:

Mo-Sys is a world-leading provider of camera tracking and camera robotic systems, supplying broadcasters such as BBC, Sky, Fox, ESPN, CNN, Discovery Channel, The Weather Channel and Netflix, among many more. With a passion for innovation and design, Mo-Sys is at the forefront of live AR and virtual production technology, with its StarTracker camera tracking system now powering more than 100 virtual TV studios around the world. For further information, visit www.mo-sys.com

Mo-Sys Company Contact:

Adam Smith, Marketing

adam@mo-sys.com

+44 (0) 208 858 3205


Mo-Sys Media Contact:

Jennie Marwick-Evans

jennie@manormarketing.tv

+44 (0) 7748 636 171



Tags: Mo-Sys Engineering | Camera Tracking | Live Production | zero-latency | camera tracking kit | Unreal Engine | Render Engine | Augmented reality software | Epic Games | Vinten Heads | Canon
Submitted by Manor Marketing
More from this author
Pixotope powers stunning visuals and engaging information in WePlay Esports championship streaming Posted: 09/04/2020

The Future Group, developer of live photo-realistic virtual production system Pixotope™, today announced that it provided the mixed reality platform for WePlay! Esports’ ‘Dota 2 Tug of War: Mad Moon’ tournament, which took place in Kiev, Ukraine, from 19-23 February. WePlay! Esports, which organises tournaments as well as providing online coverage, has the vision to blend competitive esports with entertainment in all-embracing coverage – “esportainment” is the word coined by Oleg Krot, CEO at WePlay! Esports.

To bring the audience into the esports action, WePlay! Esports decided to use augmented reality. “For us, augmented reality is a powerful tool to enhance the tournament theme, and add creativity and consistency to the broadcast,” said Aleksii Gutiantov, Head of AR Dept at WePlay! Esports. He turned to the proven Pixotope solution from The Future Group: “the best product on the market in my experience,” Gutiantov added.

Pixotope takes the remarkable rendering power of the Unreal Engine from Epic Games and builds it into a complete virtual production hub. While offering a user interface simple enough for the pressures of live production, it provides all the power of an augmented reality system, including camera tracking, motion capture and data-driven graphics, along with the ability to make last-minute changes while in live mode.

 

Tags: The Future Group | Pixotope | WePlay! | Esports | championship | photo-realistic | virtual production | mixed reality | esportainment | Oleg Krot | Aleksii Gutiantov | Unreal Engine | Epic Games | Maksym Bilonogov | The Mad Moon | Marcus Blom Brodersen | MKM Marcomms
Submitted by MKM Marketing Communications
More from this author
Mo-Sys Transforms Remote Production and Virtual Studios Posted: 11/03/2020

Mo-Sys Engineering, a global leader in real-time camera tracking and camera remote systems, can now remotely operate cameras on the other side of the world, without perceived delay, with the launch of TimeCam.

TimeCam represents a triple benefit to production companies. First, there is the saving in cost and environmental impact in sending camera operators to site. Second, it means that the most in-demand operators can be much more productive, providing excellent coverage at a live event each day rather than losing time through travel. Third, it means that you can add cameras to your coverage without adding headcount: for instance, a downhill ski race might have eight cameras along the course, with one operator controlling cameras 1, 3, 5 and 7 and a second controlling 2, 4, 6 and 8.

“’Traditional’ remote production puts the control back at base, but still needs camera operators to travel to the location,” explained Mo-Sys CEO Michael Geissler. “By compensating for latency in transmission and compression/decoding, TimeCam means that operators too can stay at base and be much more productive by operating on several events, when normally they could only be on one.”

As well as unveiling TimeCam, Mo-Sys will also demonstrate its flagship virtual studio technology StarTracker, which uses a constellation of dots on the studio ceiling as a camera tracking system. This is now extensively used by many prestige broadcasters to provide the tracking for augmented reality studios, and is increasingly being built into studio cameras.

Whereas most virtual studios use a proprietary graphics system which in turn uses the Unreal gaming engine from Epic Games, StarTracker Studio (in a 19” rack) now features the Mo-Sys VP plug-in, which the company calls Unreal Unleashed: a direct interface between the camera tracking and the UE4 render engine. Through the plug-in, control is direct and no other software layer is wrapped around the Unreal Engine, allowing full access to the latest UE4 features.

“Both TimeCam and StarTracker with Unreal Unleashed are transformative technologies, capable of bringing new creativity and productivity to the broadcast and movie worlds, allowing sophisticated productions around the globe,” Geissler said.

www.mo-sys.com

Tags: Mo-Sys Engineering | NAB | real time camera tracking | camera remote systems | TimeCam | virtual studio | augmented reality | latency | transmission | compression/decoding | StarTracker | Mo-Sys VP | Unreal Unleashed | UE4 render engine | turnkey | NAB 2020
Submitted by Manor Marketing
More from this author
Creative Works London delivers immersive and spectacular visuals to Guns N Roses North American Tour using disguise and Unreal Posted: 05/03/2020

When Guns N’ Roses’ successful ‘Not In This Lifetime’ tour switched to disguise for its North American leg, it opened up new creative possibilities to enhance the already visually spectacular rock show.

The tour wrapped last November with two shows at The Colosseum in Caesars Palace, Las Vegas. The three-year world tour played 158 concerts selling more than 5 million tickets and grossing $584 million. The ‘Not In This Lifetime’ tour that started in 2016 marked the first time that Axl Rose, Slash and Duff McKagan played together since 1993. Screenworks-NEP Live Events was the disguise rental partner for the tour.

Jeremy Leeor, Managing Director, and Dan Potter, Creative Director, at disguise studio Creative Works London, coordinated the creation, execution and delivery of the Ghostrain and traditional show walk-in opening sequence using the new disguise and Unreal integration, currently in beta testing.

"A lot of what we created for the Guns’ tour visuals was built similarly to the way triple-A game assets are made using highly creative and flexible 3D tools like Cinema 4D, ZBrush, 3Ds Max and Substance Painter to create high-quality assets that can then be moved into real-time render engines like Unreal Engine,” explains Dan. “This is how we’ve always done it, and it makes a lot of our content stand out for its depth, texture and richness."

He points out that, “the rendering process on any project, in a conventional sense, presents many render engine and creative compositing options, and is central to how we achieve the diverse range of visual styles in our work. With the introduction of real-time render engines, you’re shifting the focal points of stages when you are developing a look to different points in the project timeline. This can be quite liberating if you’re working on something that plays to the strengths of the look you want to achieve and using the right tools to do it."

Tags: Guns N Roses | disguise | Creative Works | LED screens | Unreal | 3D | Cinema 4D | ZBrush | 3Ds Max | NEP Live Events | concert touring | Bubble Agency
Submitted by Bubble Agency
More from this author
Viz Engine 4 released ahead of IBC2019 Posted: 09/09/2019

Vizrt, the innovator of software-defined visual storytelling (#SDVS) platforms, has released Viz Engine 4. This much anticipated release revolutionizes how virtual studio sets and augmented reality graphics are designed, rendered and combined with live video, and focuses on driving the complexity out of production workflows.

Tags: Vizrt | Unreal Engine 4 | The Big AR Sports Show | AR | virtual studios | Fusion Keyer | IBC 2019
Submitted by Bubble Agency
More from this author
ChyronHego Launches a Fresh Take on AR and Virtual Set Graphics for News, Weather, and Sports Posted: 09/04/2019
ChyronHego today introduced Fresh, an all-new graphics-rendering solution that integrates Epic Games' Unreal Engine 4 (UE4) with ChyronHego's augmented reality (AR) and virtual set (VS) software. Fresh makes news, weather, or sports look more real than ever because the AR graphics — including text and titles — are all completely integrated into the Unreal scene.
Tags: ChyronHego | Fresh Take | AR | Virtual Set Graphics | News | Weather | Sports | Graphics | Unreal Scene | NAB 2019
Submitted by Wall Street Communications
More from this author
Unreal Engine Supports DeckLink Products, DeckLink SDK Available from Unreal Marketplace Posted: 09/01/2019
Blackmagic Design announced today that Epic Games’ Unreal Engine 4.21 now supports Blackmagic Design’s DeckLink 8K Pro, DeckLink Duo 2 and DeckLink 4K Extreme 12G capture and playback cards. DeckLink SDK binaries and source code from Epic will also now be available free to download on the company’s Unreal Engine Marketplace.
Tags: Epic Games | Unreal Engine 4 | Gaming | DeckLink | SDK | Unreal Engine Marketplace | Blackmagic Design
Submitted by Blackmagic Design
More from this author
ChyronHego and Epic Games to Integrate Unreal Engine With ChyronHego AR and Virtual Set Software Posted: 09/04/2018

ChyronHego today announced a partnership with leading game developer Epic Games to integrate the Unreal Engine with ChyronHego’s family of augmented reality (AR) graphics and virtual set solutions. With the integration, news broadcasters and other customers of ChyronHego’s Neon and Plutonium software will be able to leverage Unreal’s industry-leading rendering and real-time special effects capabilities to add powerful new photorealistic and hyper-realistic elements to their on-air virtual sets.

Tags: ChyronHego | Unreal Engine | Epic Games | augmented reality (AR) graphics | virtual set solutions | NAB Show 2018 | NAB 2018
Submitted by Dundee Hills Group
More from this author
Epic Games Announces Unreal Engine Integration With Strategic Partners in Broadcast Posted: 25/04/2017

At NAB 2017, Epic Games will reveal applications of Unreal Engine that deliver unprecedented speed, fidelity and flexibility in broadcast production workflows. Ross Video, Vizrt and Zero Density will demonstrate integrations with Unreal Engine that illustrate the power of the engine’s real-time rendering capabilities. In addition, The Future Group and House of Moves will showcase how Unreal Engine is helping to break new ground in episodic entertainment production and delivery at Epic’s first Unreal Engine NAB press conference (April 24, 4-5PM, Room N239, Las Vegas Convention Center).

Tags: Unreal Engine | EPIC | NAB 2017 | The Future Group
Submitted by Epic Games
More from this author
Ncam demonstrates integration with Unreal Engine at NAB 2016 Posted: 30/03/2016

Ncam, global leader in augmented reality for television and film production, will be demonstrating ground-breaking advances in creative capabilities at NAB 2016 (18-21 April, Las Vegas Convention Center). By combining Ncam’s unrivalled technological capabilities with Epic Games’ Unreal Engine, augmented reality takes a massive leap forward into photorealism.

Tags: Ncam | AR | Augmented Reality | Photorealism | Depth | Epic Games | Unreal Engine | NAB | NAB 2016
Submitted by MKM Marcomms
More from this author