Meet migenius at NVIDIA GTC in San Jose!

Come and meet us at NVIDIA’s annual GPU Technology Conference in San Jose, 26-29 March 2018. NVIDIA’s theme this year is “AI & Deep Learning” and we’ll be at the ‘Iray Plugins’ booth #826, where you’ll be able to see examples of Iray images rendered with the latest AI denoising technology. AI now accelerates denoising of Iray images by a factor of up to 10x.

Read More


Another Award for Bloom Unit

Bloom Unit, our physically accurate, photorealistic renderer for SketchUp, has just won another award. The LIT Lighting Design Awards selected Bloom Unit for an award in the Innovative Lighting Design Software Applications category, making this the third lighting industry accolade handed out for a RealityServer based product. Behind the scenes, Bloom Unit uses RealityServer as its core rendering engine and much of its functionality is inherited from this platform. It’s great to see recognition of the difference physical accuracy makes in the usefulness of rendering for this industry.

Continue Reading

What’s New in RealityServer 5.1 Update 173

We recently released RealityServer 5.1 build 2017.173. This is mainly an incremental and bug fix release but adds some cool features some customers have been waiting for, including multiple UV sets, materials and holes for the generate_mesh command, updated AssImp plugin, a new Smart Batch command and render loop improvements.

Read More


Basic Canvas Operations in V8

RealityServer 5.1 introduced new functionality for working with canvases in V8. In this post I’m going to show you how to do some basic things like resizing and accessing individual canvas pixels. We’ll build a fun little command to render a scene and process the result into a piece of interactive ASCII art. Of course, this doesn’t have much practical utility but it’s a great way to learn about this new feature!

Read More


RealityServer 5.1 with AI Denoising

RealityServer 5.1 is here and it has something a lot of users have been asking about. This release adds the new AI Denoising algorithm for fast, high quality denoising of your images using state of the art machine learning technology. You really need to try it to fully appreciate the performance benefits; however, we’ll show you a few images to give you a feel for what it is capable of. We are also adding support for the new NVIDIA Volta architecture and, as usual, a range of other smaller enhancements.

Read More


migenius Welcomes Iray for Rhino

Starting November 20th 2017, migenius has taken over the support and development of the Iray for Rhino plugin. We have been big fans of Rhino for a long time and look forward to expanding the plugin’s features going forward, including increased compatibility with RealityServer and integration of the latest Iray features. We have released an updated version of the plugin for Rhino 5.0 with Iray 2017.1.1, including the incredible new AI denoising functionality. You can download a free 30 day trial and purchase Iray for Rhino.

Continue Reading

Bring Your Designs to Life!

RealityServer for Onshape is now live! Following a successful beta, migenius is pleased to announce that RealityServer rendering for Onshape is now available as a public service. Visit the RealityServer rendering page in the Onshape appstore to subscribe and start rendering fast, photorealistic images of your Onshape models immediately.

The first two hours every month are free…

2 free hours per month are included in your subscription and you can purchase additional rendering hours as you need them. Your rendering runs on a dedicated server with a high end NVIDIA GPU and to show you how straightforward the service is to use we have created this short movie to get you started.

Happy rendering!

Continue Reading

What’s New in RealityServer 5.0

We are happy to announce the immediate availability of RealityServer 5.0. There are some great new features so we’ve put together a quick list of the headline items. We will also be posting additional articles on the individual features and how to use them but for now take a look at what’s new.

Read More


RealityServer on AWS

A lot of new customers ask us where they can run RealityServer, since they don’t have their own server or workstation with NVIDIA GPU hardware available. Starting up RealityServer on Nimbix, where everything is pre-configured for you, is covered in another article; on AWS, however, you need to do a bit more setup yourself. We are assuming here that you are already familiar with Amazon Web Services and starting instances on Amazon EC2, along with basic concepts like security groups. We won’t cover the basics of how to start an instance here; there is lots of good information about that online, including this guide from Amazon. So, let’s get started.

Instance Selection and Starting

Before you dive in and launch some instances on AWS to run RealityServer, you should first think about where you want to run your server (which region) as well as which instance type you want to use. While RealityServer can work on pure CPU instances, to access all features and get the best performance you should really use AWS GPU instances. There are currently three types of GPU instances available on AWS that support RealityServer well:

  • p2 – NVIDIA Tesla K80 (p2.xlarge, p2.8xlarge, p2.16xlarge)
  • g3 – NVIDIA Tesla M60 (g3.4xlarge, g3.8xlarge, g3.16xlarge)
  • p3 – NVIDIA Tesla V100 (p3.2xlarge, p3.8xlarge, p3.16xlarge)

All of the current generation instance types use Tesla GPU hardware and offer various levels of performance and memory capacity. The p3 instances use the Tesla V100 GPU, which at the time of writing was the fastest available; of course, Amazon’s offerings might change in the future. You can see our benchmark data on how these instance types perform. When selecting which region to start your node in, consider how far it is likely to be from you; closer servers mean lower latency. We do not recommend previous generation instance types for RealityServer and they may not be supported in some cases.

When you start your instances, ensure you open TCP ports 8080, 8081 and 1935 in your security group, in addition to the standard SSH port. We generally use the standard Amazon Linux 2 HVM AMI (usually first in the list when launching from the console); for GPU instances you must use HVM AMIs. These instructions assume you are using the standard Amazon Linux 2 AMI; other distributions can also be used with RealityServer, but you will need to sort out the installation process yourself. In general, since Amazon Linux is Red Hat Enterprise Linux based, CentOS and RHEL should behave in a similar way.
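If you prefer scripting the security group over clicking through the console, the three ingress rules can be generated with a short loop. This is a sketch only: the group id below is a placeholder, and you should review the printed commands before running any of them.

```shell
# Generate (but do not run) the AWS CLI calls that open the TCP ports
# RealityServer needs. sg-0123456789abcdef0 is a placeholder id.
sg=sg-0123456789abcdef0
for port in 8080 8081 1935; do
  printf 'aws ec2 authorize-security-group-ingress --group-id %s --protocol tcp --port %s --cidr 0.0.0.0/0\n' "$sg" "$port"
done
```

Once the group id is real, pipe the output to sh or run each command individually. Note that 0.0.0.0/0 opens the ports to the world; restrict the CIDR if you only need access from known addresses.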

Dependencies and Configuration

Once you have launched your instance and are ready to connect you can ssh into your instance and start setting up the GPU drivers and RealityServer. First let’s install some dependencies we will need:

sudo yum update -y
sudo yum install -y gcc make wget libX11 libGLU libSM
sudo yum install -y kernel-devel-$(uname -r) kernel-headers-$(uname -r)
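The NVIDIA driver we install below builds a kernel module against these headers, so the kernel-devel packages must match the kernel you are actually running. A quick sanity check along these lines (the /usr/src/kernels path is where Amazon Linux and other RHEL family distributions put them) can save a failed install:

```shell
# Check that headers for the *running* kernel are present before
# attempting to build the NVIDIA kernel module.
running=$(uname -r)
if [ -d "/usr/src/kernels/$running" ]; then
  echo "headers ok for $running"
else
  echo "headers missing for $running (reboot into the new kernel or reinstall kernel-devel)"
fi
```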

If the yum update command installs a new kernel version you should reboot prior to installing the kernel headers and development package. In order to obtain your licenses for RealityServer you need to set the SPM_HOST environment variable. You can do this by hand each time you login, but to keep it persistent you can do the following (replacing [spm-host] with the SPM server address provided by migenius):

sudo sh -c 'echo "setenv SPM_HOST [spm-host]" > /etc/profile.d/spm.csh'
sudo sh -c 'echo "export SPM_HOST=[spm-host]" > /etc/profile.d/spm.sh'

One additional step is required to obtain your licenses; you need to tell SPM which port to connect to our servers on (we definitely realise this licensing process is a bit complex and are working on ways to simplify this in the future). Add the licensing port to your /etc/services file as follows (replacing [port] with the licensing port provided by migenius):

sudo sh -c 'echo -e "mi-spm\t[port]/tcp\t# migenius SPM License Server" >> /etc/services'
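To sanity check the appended line, grep for it afterwards. The sketch below writes to a temporary file and uses 40000 as a stand-in for the real licensing port; on your instance, run the same grep against /etc/services itself.

```shell
# Append the SPM entry to a scratch file and confirm it matches.
# 40000 is a placeholder for the licensing port from migenius.
tmp=$(mktemp)
printf 'mi-spm\t40000/tcp\t# migenius SPM License Server\n' >> "$tmp"
grep -E '^mi-spm[[:space:]]+[0-9]+/tcp' "$tmp" && echo "entry ok"
```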

Make sure you use the double redirection (>>), you don’t want to overwrite your /etc/services file! Next let’s get the NVIDIA drivers installed. You can check for the latest Linux driver on the NVIDIA driver download page; we used the version that worked well at the time this was written (on the p2 instance types). Download the .run installer to your instance and run it, substituting the filename for the version you downloaded:

sudo sh ./NVIDIA-Linux-x86_64-[version].run -a

Follow all of the prompts; it’s pretty self-explanatory. When running RealityServer you may get warnings about ECC being enabled. Unfortunately AWS does not offer the ability to disable this GPU feature, so you may take a slight performance hit and you will lose 12.5% of your GPU memory capacity. This is just due to the way AWS works and sadly not something we can fix with RealityServer. Future releases of RealityServer may require more recent CUDA versions, so ensure the driver you install supports the minimum level of CUDA needed.

Installing RealityServer

With the GPU drivers installed (if you want to confirm they detect the GPU just run nvidia-smi and check out the output) you can now download and install RealityServer:

sudo mkdir -p /usr/local/migenius
cd /usr/local/migenius
sudo wget [realityserver-download-url]
sudo tar zxvf rsws-53-2593.88.tgz
sudo chown -R ec2-user:ec2-user rsws-53-2593.88
sudo ln -s rsws-53-2593.88 rsws

Obviously, replace the RealityServer download link with the one for the release you want to use. The last line above creates a symlink to the version you have installed; we will use this later when setting up services, so you can easily switch versions if you need to.
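To see what the symlink buys you, here is the version switch it enables later, sketched in a temporary directory. The rsws-54-0000.1 build number is hypothetical; substitute whatever release you actually install.

```shell
# Demonstrate switching the rsws symlink between two installed versions.
base=$(mktemp -d)
mkdir "$base/rsws-53-2593.88" "$base/rsws-54-0000.1"   # second build is hypothetical
ln -s rsws-53-2593.88 "$base/rsws"
readlink "$base/rsws"    # rsws-53-2593.88
# Repoint the link at the new build; anything referencing .../rsws
# needs no change.
ln -sfn rsws-54-0000.1 "$base/rsws"
readlink "$base/rsws"    # rsws-54-0000.1
```

The -n flag matters here: without it, ln would follow the existing link and create the new symlink inside the old version’s directory instead of replacing the link itself.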

Test Run RealityServer

If all goes well you can now do a test run and get RealityServer going. We recommend doing this before trying to install RealityServer as a service since it will be more difficult to diagnose issues at that stage. Here are the commands to test run RealityServer, assuming you are still in the directory from above:

cd rsws
. /etc/profile.d/spm.sh
./realityserver_ws

You should see a lot of log output. You can then connect to RealityServer and explore the examples and documentation by going to the following address (where [ec2-ip] is the public IP of your AWS instance):

http://[ec2-ip]:8080/
If you want to start RealityServer so that it keeps running after you log out you can use nohup for that:

nohup ./realityserver_ws > rs.log 2>&1 &

Here we are sending the log output to rs.log so you won’t see anything on screen. To see the log output you can just tail the file:

tail -f rs.log
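When you later want to stop a server started with nohup, capture its PID at launch. The sketch below uses sleep as a stand-in for ./realityserver_ws so you can try it anywhere:

```shell
# Start a background process the same way as RealityServer above,
# keep its PID, then stop it and wait for it to exit.
nohup sleep 60 > /tmp/rs-demo.log 2>&1 &
pid=$!
echo "started as pid $pid"
kill "$pid"
wait "$pid" 2>/dev/null
echo "stopped"
```

kill sends SIGTERM by default, which gives the process a chance to shut down cleanly; with the real server, check rs.log afterwards for its shutdown messages.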

When you’re done with your instance and don’t want to leave it running, you can just power it off:

sudo poweroff

On AWS this puts the instance into a stopped state and you can restart it at any time. You don’t get charged for the instance while it’s stopped (only for the storage associated with it), so this can be a handy way to keep a RealityServer development node ready to run. Now that things are working correctly, we can install RealityServer as a system service so that it runs on startup and restarts if it fails for any reason.

Installing as a System Service

The raging debate over init.d vs systemd is now pretty much in the past, so we can recommend using systemd to manage your RealityServer service. This isn’t a complete guide to systemd, but we’ll run through the basics of how to use it to control RealityServer. First you will need to create the service file in /etc/systemd/system/realityserver.service. We can do this in one command with:

sudo tee -a /etc/systemd/system/realityserver.service > /dev/null << EOL
[Unit]
Description=RealityServer

[Service]
User=ec2-user
# Required on some systems
WorkingDirectory=/usr/local/migenius/rsws
ExecStart=/usr/local/migenius/rsws/linux-x86-64/bin/realityserver --config_file /usr/local/migenius/rsws/realityserver.conf \$RS_ARGUMENTS
# Restart service after 5 seconds if RealityServer crashes
Restart=always
RestartSec=5
# Output to syslog
StandardOutput=syslog
StandardError=syslog
SyslogIdentifier=realityserver

[Install]
WantedBy=multi-user.target
EOL

This is a single multi-line command; you can cut and paste the entire command at once. Next we need to make the configuration directory for our service:

sudo mkdir /etc/systemd/system/realityserver.service.d

This directory holds the override.conf file, which allows us to set up environment variables that are used when starting RealityServer. We can add this with another multi-line command:

sudo tee -a /etc/systemd/system/realityserver.service.d/override.conf > /dev/null << EOL
[Service]
Environment="RS_ARGUMENTS='--network off' '-o iray_render_mode=cuda_dynamic'"
# Replace [mdl-path] with the location of your MDL packages (only needed
# if you use vMaterials or MDL Material Exchange)
Environment="MDL_SYSTEM_PATH=[mdl-path]"
EOL

The last line of the configuration adds the MDL system path environment variable which you will need if you have vMaterials or the MDL Material Exchange libraries installed. If you don’t use either of these you can omit this line. You’ll also want to uncomment the lines in your realityserver.conf file that reference the MDL_SYSTEM_PATH environment variable.

At this point the service is set up; if systemd does not see it yet, run sudo systemctl daemon-reload to make it re-read its configuration. You should be able to check the service with the following command:

systemctl status realityserver

You should see something like this:

● realityserver.service - RealityServer
   Loaded: loaded (/etc/systemd/system/realityserver.service; disabled; vendor preset: disabled)
  Drop-In: /etc/systemd/system/realityserver.service.d
   Active: inactive (dead)

So it’s there and everything has been found, but it hasn’t yet been enabled and it isn’t running. Before we fix that we’ll set up one more thing: logging. The service is currently set up to log all of its output to syslog, which is going to make it difficult to separate RealityServer messages from other system messages. To fix that we need to do a little rsyslogd configuration. We’ll use the following multi-line command to set this up:

sudo tee -a /etc/rsyslog.d/realityserver.conf > /dev/null << EOL
\$FileCreateMode 0644
\$template RealityServerFormat,"%rawmsg%\n"
\$template RealityServerFile,"/var/log/realityserver.log"
if \$programname == 'realityserver' then {
    action(type="omfile" dynaFile="RealityServerFile" template="RealityServerFormat")
    stop
}
\$FileCreateMode 0600
EOL

This will change the configuration so that all output is now logged to /var/log/realityserver.log. If you want to allow reading of the log files as the ec2-user, you will also need to modify /etc/rsyslog.conf by adding the following lines at the top:

# Reset umask so FileCreateMode can be used instead
$umask 0000
# By default logs should only be readable by process owner
$FileCreateMode 0600

We now need to force rsyslogd to reload our configuration before we start RealityServer:

sudo systemctl restart rsyslog

Finally we are ready to enable the service so that it starts up on system boot.

sudo systemctl enable realityserver

Which should output something similar to this:

Created symlink from /etc/systemd/system/multi-user.target.wants/realityserver.service to /etc/systemd/system/realityserver.service.

Since we haven’t rebooted, the service is not yet running, so we can now start it up with:

sudo systemctl start realityserver

If you tail the contents of /var/log/realityserver.log you should see the last couple of lines look something like this:

19/09/06 09:35:35   1.0   V8     main info : Started.
19/09/06 09:35:35   1.0   PLUGIN main info : Started RealityServer(R) Web Services.

Assuming you can connect to RealityServer in your browser as you did in the earlier steps, everything should be working now. You can test that the service runs on startup by rebooting the machine. Of course, don’t forget to shut down the instance if you are not using it. You now have an instance which will automatically run RealityServer on startup using the standard systemd process. If you create an AMI of this instance you can then start more instances with the same configuration.

NVIDIA Persistence Daemon

We’ve seen some issues recently where, during startup, the NVIDIA driver was not initialised and loaded. This can cause problems with RealityServer startup. To ensure the driver is always loaded we recommend installing and enabling the NVIDIA Persistence Daemon. The daemon binary is installed by the driver, however it is not registered as a system service. You can add it as a system service in a similar way to RealityServer with the following commands.

sudo tee -a /etc/systemd/system/nvidia-persistenced.service > /dev/null << EOL
[Unit]
Description=NVIDIA Persistence Daemon
Wants=syslog.target

[Service]
Type=forking
ExecStart=/usr/bin/nvidia-persistenced --user ec2-user
ExecStopPost=/bin/rm -rf /var/run/nvidia-persistenced

[Install]
WantedBy=multi-user.target
EOL

sudo systemctl enable nvidia-persistenced
sudo systemctl start nvidia-persistenced

Taking it Further

There is a lot we haven’t covered here; in a production system you will likely also want load balancing, clustering and more. We don’t cover these topics here, but don’t hesitate to contact us if you want more details on how to do these things. We also haven’t covered creating an AMI from your work here so you can start more nodes; this is essentially no different to any other Amazon instance type, so there is a lot of good information out there on that.

One note: if you want to try clustering multiple nodes on AWS, you will need to use TCP based clustering (without the UDP discovery option) because AWS does not support UDP multicast on its network.

Happy Rendering!

Continue Reading

migenius Wins Lux Award for Bloom Unit


Watched by almost 1,000 people at the UK lighting industry’s Lux Awards in London, migenius’ Bloom Unit rendering plugin for SketchUp won the “Enabling Technology of the Year” award ahead of a shortlist comprising some much bigger and perhaps better known names (see the full list here).

The panel of 17 judges drawn from all over the industry cited Bloom Unit as a product that was a ‘step change’ in lighting design: by ‘leveraging the power of cloud computing’ migenius had delivered a product that allows renders ‘of awesome scope and accuracy’.

A feeling for just how prestigious this event is can be gained from the official movie, with migenius and their UK partner for the lighting industry, onlight, picking up their award around the two minute mark.

We are certainly proud that Bloom Unit has been recognised by the UK lighting industry. Proof indeed of just how important the RealityServer technology that underpins it is for businesses wishing to differentiate the user experience for the products and services they are bringing to market.

If you want to check out Bloom Unit for yourself, sign up for a free 14 day trial on the Bloom Unit website.

Continue Reading

What’s New in RealityServer 4.4 Update 93

We recently released RealityServer 4.4 build 1527.93. This update included Iray 2016.2 and some interesting new features. While still an incremental update, the big item many of our customers have been asking for is finally here, NVIDIA Pascal architecture support. So your Tesla P100, Quadro P6000, Quadro P5000, GeForce GTX TITAN X, GeForce GTX 1080, 1070 and other Pascal cards will now work with RealityServer. Keep reading for some more details of the new features in update 93 of RealityServer.

Read More


RealityServer on Nimbix JARVICE 2.0

The number of cloud service providers offering NVIDIA GPU resources is increasing and in today’s article we will show you how to get started using RealityServer with Nimbix. migenius has deployed several of its customer projects on the Nimbix platform and it offers some unique advantages such as containerised environments (instead of virtualisation), fast start-up times and usage charged by the minute instead of by the hour. On Nimbix, migenius has set up a pre-configured RealityServer environment for you; keep reading to learn how to sign up for Nimbix services and get RealityServer up and running.

Continue Reading

3D Transformations – Part 2 SRT

In this, the second part of our article on transformations, I will introduce SRT (Scaling, Rotation, Translation) transformations. Unlike the previous article, this one has a lot less maths and shows you a simpler way to work with transformations in RealityServer. Additionally, the method allows for automatic, smooth interpolation of transformations over time, which is great for creating animations. Once things are moving you can also introduce motion blur for more realistic results. Read on to discover the ease of SRT transformations.

Read More


3D Transformations – Part 1 Matrices

Transformations are fundamental to working with 3D scenes and can be frequently confusing to those who haven’t worked in 3D before. In this, the first of two articles, I will show you how to encode 3D transformations as a single 4×4 matrix which you can then pass into the appropriate RealityServer command to position, orient and scale objects in your scene. In the second part I will dive into a newer method of specifying transformations in RealityServer called SRT transformations, which also allows for the easy animation of objects.

Read More


Creating Lighting Programmatically

In this article I am going to show you how to add light sources to your RealityServer scene using the Web-services API. You will learn how to add several different types of lights, including a photometric light using an IES data file, an area light, a spot light and daylight. This will be a very simple example but will give you all of the pieces you need to programmatically add lighting to your scene. You can expand on the concepts shown here to make different types of lighting very easily.

Read More


VR Rendering with RealityServer 4.4 Update 46

RealityServer 4.4 build 1527.46 has just been released adding Iray 2016.1.1 which includes support for rendering stereo, spherical VR imagery suitable for viewing with devices such as the Oculus Rift, HTC Vive, Samsung GearVR, OSVR and Google Cardboard viewers. There are also numerous small additions and bug fixes and some other new features such as spectral rendering, however VR rendering is the headline item. In this article we will show you how to do simple VR rendering with RealityServer.

Read More


Creating a Simple Scene Programmatically

In this article I am going to show you how to create a simple 3D scene, completely from scratch using RealityServer. You will learn about the anatomy of a RealityServer scene and the different components that go into making it up, including options, groups, instances, cameras, geometry and environment lighting. While the scene will be very simple there will be many key principles of RealityServer and NVIDIA Iray demonstrated which you can expand on to build more complex scenes.

Read More


Exploring the RealityServer JSON-RPC API


When getting started with RealityServer, many customers ask us the best place to begin in order to learn how RealityServer works. One of the best and most enjoyable ways we find is to explore the JSON-RPC API which remains the main way that RealityServer functionality is accessed. In this article we will provide an overview of how the RealityServer JSON-RPC API works and some of the best ways to explore and play with functionality exposed there. Whether you are new to RealityServer or a veteran user you will find some valuable pointers.

Read More


What’s New in RealityServer 4.4 Update 40


Today we released RealityServer 4.4 build 1527.40. This incremental update focuses on features to help make development with RealityServer easier. It includes many elements which enable RealityServer to do more out of the box without having to write your own plugins. When new customers get their first look at RealityServer we often get many of the same questions about how to do certain things. We hope with this release and future releases to start covering many of these with off the shelf functionality. Most of the information here is also contained in the RealityServer release notes and documentation but if you don’t have RealityServer yet you can read about some of these new features below.

Read More


RealityServer 4.4 with Iray 2016.0

We have just released RealityServer 4.4, which includes the new NVIDIA Iray 2016.0. We will periodically release updated versions as new improvements and Iray updates become available. We’ll cover some of the highlights of this release here, but users are strongly encouraged to read both the RealityServer release notes (relnotes.txt) and the Iray release notes (neurayrelnotes.pdf) provided with the release. Let’s take a look at those new features, some of which many of our users have been asking about for some time.

Read More


RealityServer for Onshape

migenius is pleased to announce we are one of the launch partners for the Onshape App Store, which has just entered private beta. We are making RealityServer available as an integrated application within Onshape, running entirely in the cloud. Check out our dedicated RealityServer for Onshape product page for further details. Our integration with Onshape is still in beta at the moment, however it is already very usable. You can request early access to the Onshape App Store private beta from this link. If you want to know more please contact us.

Continue Reading

Getting Started with RealityServer

So you have obtained RealityServer and installed your license server, what now? We frequently get questions about the best place to start learning about RealityServer and how to use it. As RealityServer is a large, very generalised platform it can be difficult to know where to start. This article provides some pointers on where to start and the best way to learn the basics.

Read More


RealityServer 4.3

We have just released RealityServer 4.3, which includes the new NVIDIA Iray 2015. There are some great new features included and we will be adding several more in incremental releases in the future based on new additions to Iray. We’ll cover some of the highlights for you here; there are also a lot of smaller additions, so we encourage users to take a look through the release notes for both RealityServer and Iray. Here is a taste of what’s new, starting with something we have been waiting a very long time for and are very glad to say has been included in this release.

Continue Reading

Making a Simple Diamond Material in MDL

It’s MDL Monday again and this week I am going to show you how to put together a simple material for simulating diamonds, including dispersion based on an Abbe number. Now, I’m not a 3D artist by any means, but MDL allows me to create a material like this based on the real physical properties of diamonds rather than trying to tune abstract parameters. This material is very simple but very useful if you need to simulate jewellery. You can build on it easily to simulate other gemstones, glasses and similar substances without much effort. Today I’m just going to start with a basic, colourless diamond and cover some concepts which are important for creating physical materials.

Read More


Procedurally Retro with MDL

NVIDIA Iray 2015 will introduce some great new features, including the ability to write your own procedural functions for use in materials. This is fantastic for creating resolution independent effects which can cover large areas without noticeable tiling artifacts (unless you want them, of course). Iray is built into our RealityServer product, so I love to test out its latest features. To put procedural functions through their paces I decided to try to emulate something procedural from my childhood: the now famous 10 PRINT program. This little one-liner, originally designed to demonstrate the capabilities of the Commodore 64, prints a maze by randomly alternating between two diagonal characters. Iray uses the NVIDIA Material Definition Language (MDL) both to define materials and custom functions; if you haven’t tried it out, this little tutorial is a great way to get started.

Read More
