RealityServer with support for NVIDIA RTX technology is here! This release includes a major Iray version bump which adds support for accelerating Iray rendering with the new RT Core hardware inside RTX cards and the Tesla T4. There is also a great performance improvement for Iray Interactive (on all cards) and support for MDL 1.5. Let’s take a look.
NVIDIA RTX technology was announced late last year and has gathered a lot of coverage in the press. Many software vendors have been scrambling to implement support for it since then, and there has been a lot of speculation about what is possible with RTX. Now that Iray RTX is finally about to be part of RealityServer, we can talk about what RTX means for our customers and where it will be most beneficial for you.
Our next RealityServer update is here. This is an incremental release with quite a few requested fixes and enhancements, but it also contains a few great new features for heavy users of MDL materials. This will be the last RealityServer 5.2 release as we will shortly be releasing RealityServer 5.3 with NVIDIA RTX support, so watch out for that one. In the meantime, let's check out what's new in this version.
The Timex Group has joined the growing list of household names who use RealityServer to create the imagery that pushes customers to choose their products over a competitor's. The retail sector demands the highest quality imagery in order to replace traditional photography with photorealistic 3D rendering. Not just quality, but also speed and scale across real-time, interactive and offline rendering.
Our first update for RealityServer 5.2 is here. It includes an Iray version bump and some nice convenience features. The most significant feature however is support for queuing renders with Iray Server. This will be of interest to those building internal rendering automation tools with RealityServer.
RealityServer 5.2 is here and adds some great functionality. Hugely expanded glTF 2.0 importer support, wireframe rendering, lightmap rendering, section plane capping, MDL 1.4 support, UDIM support and many more features have been added along with many fixes and smaller enhancements based on extensive customer feedback. In this post we will run through some of the most interesting functionality and how it can help you build your applications.
We’ve covered server-side V8 commands before, but in this post we will go into a little more detail and use some of the helper classes that are provided with RealityServer to make common tasks easier. Quite often you want to kick off an application by creating a valid, empty scene ready for adding your content. Actually, it’s something we need to do in a lot of our posts here, so to avoid repeating it each time, let’s make a V8 command to do it for us.
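To give a feel for the steps such a command wraps up, here is a minimal sketch of the JSON-RPC 2.0 batch a client would send to assemble a valid empty scene (a server-side V8 command performs the equivalent operations via its API). The command names come from the RealityServer command set; the parameter names and element names used here are from memory and illustrative only, so check them against the command documentation before relying on them.

```javascript
// Build a single JSON-RPC 2.0 request object.
function rpc(method, params, id) {
    return { jsonrpc: '2.0', method: method, params: params, id: id };
}

const sceneName = 'empty_scene'; // hypothetical names throughout

// Batch that creates a scene plus the root group, options and camera
// (wrapped in an instance) that Iray expects before rendering.
const batch = [
    rpc('create_scene', { scene_name: sceneName }, 1),
    rpc('create_element', { element_name: 'root', element_type: 'Group' }, 2),
    rpc('scene_set_rootgroup', { scene_name: sceneName, group: 'root' }, 3),
    rpc('create_element', { element_name: 'opt', element_type: 'Options' }, 4),
    rpc('scene_set_options', { scene_name: sceneName, options: 'opt' }, 5),
    rpc('create_element', { element_name: 'cam', element_type: 'Camera' }, 6),
    rpc('create_element', { element_name: 'cam_inst', element_type: 'Instance' }, 7),
    rpc('instance_attach', { instance_name: 'cam_inst', item_name: 'cam' }, 8),
    rpc('group_attach', { group_name: 'root', item_name: 'cam_inst' }, 9),
    rpc('scene_set_camera_instance', { scene_name: sceneName, camera_instance: 'cam_inst' }, 10)
];
// The batch would then be POSTed as JSON to the RealityServer endpoint.
```

A V8 command that does all of this server-side saves the round trips and gives you a single reusable call for every example that follows.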
A core concept in RealityServer which many new users have some difficulty understanding is Scopes. The use of scopes is critical in making effective use of RealityServer in a production environment where multiple users or multiple independent operations are happening at once. In this article we will go into more depth on what scopes are and how to use them.
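As a taste of the pattern, the sketch below shows how a user's work can be isolated in its own scope and then discarded in one call. `create_scope`, `use_scope` and `delete_scope` are RealityServer commands; the scope and scene names are hypothetical, and the exact parameters of `import_scene` should be checked against the command documentation.

```javascript
// Build a single JSON-RPC 2.0 request object.
function rpc(method, params, id) {
    return { jsonrpc: '2.0', method: method, params: params, id: id };
}

const userScope = 'user_1234'; // e.g. derived from the user's session id

// Everything after use_scope executes inside user_1234's scope, so this
// user's database edits are invisible to commands running in other scopes.
const batch = [
    rpc('create_scope', { scope_name: userScope }, 1),
    rpc('use_scope', { scope_name: userScope }, 2),
    rpc('import_scene', { scene_name: 'scene', filename: 'scenes/example.mi' }, 3)
];

// When the user's session ends, deleting the scope frees everything
// that was created inside it in one operation.
const cleanup = [rpc('delete_scope', { scope_name: userScope }, 1)];
```

This create/use/delete lifecycle is the core of multi-user RealityServer deployments and is exactly what the rest of this article unpacks.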
In this article we’ll take a quick look at how to use the UAC system in RealityServer to effectively manage user sessions and clean up server memory when users go away. There won’t be a lot of pretty pictures (well, there is one if you make it to the end), but for those of you getting your hands dirty with RealityServer in production, you’ll get some valuable pointers to help keep your server from filling up with unused data.
In our last post we explored using the RealityServer compositing system to produce imagery for product configurators at scale. Check out that article first if you haven’t already as it contains a great introduction to how the system works. In this follow up post we will explore the possibilities of using the same system to modify the lighting in a scene without having to re-render, allowing us to build a lighting configurator.
RealityServer 5.1 introduced a new way to generate images of configurations of your scenes without the need to re-render them from scratch. We call this Compositing even though it’s actually very different to traditional compositing techniques. In this article we will dive into the detail of how to use the new system to render without rendering and speed up your configurator.
19 September, 2018, London — Project 424 welcomes its first brand technology partners to the team as the development of the world’s first all-electric and autonomous Le Mans Prototype race car enters an exciting new phase.
The trio of new partners, Onshape, SimScale and migenius, will form an integral part of Project 424’s overall development through the provision of high-performance, cloud-based tools for the design, simulation and 3D prototype renderings of this unique Le Mans Prototype race car.
TapGlance is a powerful and intuitive interior design app. Within minutes and without any prior experience, you can create photo-realistic images of just about any interior design project you have in mind.
Drag and drop furniture, fixtures and appliances into your plan – more than 2000 items are included with the app for free. Test material combinations using over a thousand included materials or import your own seamless textures or camera photos.
RealityServer 5.1 Update 251 has just been released. It’s mainly a bugfix release but adds a few nice extras that several customers have asked about, including a PBR MDL material, Iray Viewer loader and commands for manipulating measured BSDF data.
RealityServer 5.1 Update 227 has just been released and it has some great new features: a new Iray version with improvements to the AI Denoiser, an easy to use compositing system and a bunch of new convenience commands. This post gives an overview of the new functionality; however, the compositing features are so significant that we are currently writing a dedicated article on them, which will be out soon.
Come and meet us at NVIDIA’s annual GPU Technology Conference in San Jose, 26–29 March, 2018. NVIDIA’s theme this year is “AI & Deep Learning” and we’ll be on the ‘Iray Plugins’ booth #826, where you’ll be able to see examples of Iray images rendered with the latest AI De-Noising technology. AI now accelerates de-noising of Iray images by a factor of up to 10x.
Bloom Unit, our physically accurate, photorealistic renderer for SketchUp, has just won another award. The LIT Lighting Design Awards selected Bloom Unit for an award in the Innovative Lighting Design Software Applications category, making this the third lighting industry accolade to be handed out for a RealityServer-based product. Behind the scenes Bloom Unit uses RealityServer as its core rendering engine and much of its functionality is inherited from this platform. It’s great to see recognition of the difference physical accuracy makes in the usefulness of rendering for this industry.
We recently released RealityServer 5.1 build 2017.173. This is mainly an incremental and bug fix release, but it adds some cool features that some customers have been waiting for, including multiple UV sets, materials and holes for the generate_mesh command, an updated AssImp plugin, a new Smart Batch command and render loop improvements.
RealityServer 5.1 introduced new functionality for working with canvases in V8. In this post I’m going to show you how to do some basic things like resizing and accessing individual canvas pixels. We’ll build a fun little command to render a scene and process the result into a piece of interactive ASCII art. Of course, this doesn’t have much practical utility but it’s a great way to learn about this new feature!
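The core of the ASCII-art trick can be sketched in plain JavaScript: map each pixel's luminance onto a ramp of characters ordered from dense to sparse. In RealityServer this logic would run inside a V8 command reading pixels from a rendered canvas; here a plain array of greyscale values in [0, 1] stands in for the canvas, so the sketch is about the mapping rather than the canvas API itself.

```javascript
// Characters ordered from visually dense (dark) to sparse (light).
const RAMP = '@%#*+=-:. ';

// pixels: row-major array of luminance values in [0, 1]
// width:  number of pixels per row
function toAscii(pixels, width) {
    let out = '';
    for (let i = 0; i < pixels.length; i++) {
        // Scale luminance into a ramp index, clamping 1.0 to the last char.
        const idx = Math.min(RAMP.length - 1, Math.floor(pixels[i] * RAMP.length));
        out += RAMP[idx];
        if ((i + 1) % width === 0) out += '\n';
    }
    return out;
}
```

With real canvas access you would first downscale the render (characters are taller than they are wide, so a non-uniform resize looks best) and then feed the resulting luminance values through a mapper like this.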
RealityServer 5.1 is here and it has something a lot of users have been asking about. This release adds the new AI Denoising algorithm for fast and high quality denoising of your images using state-of-the-art machine learning technology. You really need to try it to fully appreciate the performance benefits; however, we’ll show you a few images to give you a feeling for what it is capable of. We are also adding support for the new NVIDIA Volta architecture and, as usual, a range of other smaller enhancements.
Starting November 20th 2017, migenius has taken over the support and development of the Iray for Rhino plugin. We have been big fans of Rhino for a long time and look forward to expanding the plugin’s features going forward, including increased compatibility with RealityServer and integration of the latest Iray features. We have released an updated version of the plugin for Rhino 5.0 with Iray 2017.1.1, including the incredible new AI denoising functionality. You can download a free 30-day trial and purchase Iray for Rhino from irayplugins.com.
RealityServer for Onshape is now live! Following a successful beta, migenius is pleased to announce that RealityServer rendering for Onshape is now publicly available as a service. Visit the RealityServer rendering page in the Onshape appstore to subscribe and start rendering fast, photorealistic images of your Onshape models immediately.
The first two hours every month are free…
2 free hours per month are included in your subscription and you can purchase additional rendering hours as you need them. Your rendering runs on a dedicated server with a high-end NVIDIA GPU, and to show you how straightforward the service is to use, we have created this short movie to get you started.
We are happy to announce the immediate availability of RealityServer 5.0. There are some great new features so we’ve put together a quick list of the headline items. We will also be posting additional articles on the individual features and how to use them but for now take a look at what’s new.
migenius has just won its second award in six months for the Bloom Unit rendering plugin for SketchUp. Bloom Unit was named ‘Disruptor of the Year’ at the Lighting Design Awards 2017 in London for what the judges called, “a brilliant use of real-time cloud computing”.
The Lighting Design Awards are a major event in the lighting industry’s calendar, presided over by an international panel of judges. Now in its 41st year, the ‘LDAs’ brought together shortlisted projects and products from all over the world for a gala night at the Hilton Park Lane in London’s prestigious Mayfair district.
A lot of new customers ask us where they can run RealityServer since they don’t have their own server or workstation with NVIDIA GPU hardware available. Starting up RealityServer on Nimbix is covered in another article, where everything is pre-configured for you; on AWS, however, you need to do a bit more setup yourself. We are assuming here that you are already familiar with Amazon Web Services and with starting instances on Amazon EC2, along with basic concepts like security groups. We won’t cover the basics of how to start an instance here; there is lots of good information about that online, including this guide from Amazon. So, let’s get started.