<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet type="text/xsl" href="https://digitalproduction.com/wp-content/plugins/xslt/public/template.xsl"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	xmlns:rssFeedStyles="http://www.wordpress.org/ns/xslt#"
>

<channel>
	<title>DIGITAL PRODUCTION</title>
	<atom:link href="https://digitalproduction.com/feed/" rel="self" type="application/rss+xml" />
	<link>https://digitalproduction.com</link>
	<description>Magazine for Digital Media Production</description>
	<lastBuildDate>Wed, 13 May 2026 09:43:19 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
<site xmlns="com-wordpress:feed-additions:1">236729828</site>	<item>
		<title>Artineering Flair 1.2 for Maya hits macOS</title>
		<link>https://digitalproduction.com/2026/05/13/artineering-flair-1-2-for-maya-hits-macos/</link>
		
		<dc:creator><![CDATA[Bela Beier]]></dc:creator>
		<pubDate>Wed, 13 May 2026 09:00:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[topnews]]></category>
		<category><![CDATA[Artineering]]></category>
		<category><![CDATA[Autodesk Maya]]></category>
		<category><![CDATA[Epic Unreal Engine]]></category>
		<category><![CDATA[Maya]]></category>
		<category><![CDATA[Maya plugin]]></category>
		<category><![CDATA[Maya rendering]]></category>
		<category><![CDATA[non-photorealistic rendering]]></category>
		<category><![CDATA[real-time rendering]]></category>
		<category><![CDATA[VFX]]></category>
		<guid isPermaLink="false">https://digitalproduction.com/?p=277352</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-25-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?fit=1200%2C675&quality=72&ssl=1" width="1200" height="675" title="" alt="A digital animation workspace features a stylized illustration of two cartoonish cats cuddling, with one sitting atop the other. The scene is textured, resembling soft fabric, and is complemented by vibrant colors. A person with dark hair is visible, discussing the project in the bottom corner." /></div><div><p>macOS support lands for Maya 2024+, licensing expands to floating and perpetual, and lines get darker with negative light response. Also: Unreal beta.</p>
<p>The post <a href="https://digitalproduction.com/2026/05/13/artineering-flair-1-2-for-maya-hits-macos/">Artineering Flair 1.2 for Maya hits macOS</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-25-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?fit=1200%2C675&quality=72&ssl=1" width="1200" height="675" title="" alt="A digital animation workspace features a stylized illustration of two cartoonish cats cuddling, with one sitting atop the other. The scene is textured, resembling soft fabric, and is complemented by vibrant colors. A person with dark hair is visible, discussing the project in the bottom corner." /></div><div><p class="wp-block-paragraph"><em>For those who don’t know the tool: <a href="https://artineering.io/software/flair" title="">Flair</a> is a stylized renderer inside <a href="https://www.autodesk.com/products/maya/overview">Autodesk Maya</a>, focused on NPR looks in the viewport and rendering, with style shaders, materials, and line effects that sit upstream of comp and delivery.</em></p>


<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe class="youtube-player" width="1200" height="675" src="https://www.youtube.com/embed/OQbQdRzUxRM?version=3&rel=1&showsearch=0&showinfo=1&iv_load_policy=1&fs=1&hl=en-US&autohide=2&wmode=transparent" allowfullscreen="true" style="border:0;" sandbox="allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox"></iframe>
</div></figure>



<h3 id="macos-arrives-and-it-is-not-a-footnote" class="wp-block-heading">macOS arrives, and it is not a footnote</h3>



<p class="wp-block-paragraph">Flair 1.2 adds macOS support for Maya 2024 and newer. The supported operating systems now cover Windows, Linux (Rocky), and macOS, with macOS builds on Apple Silicon only. The setup requirements list macOS 14 as the minimum and macOS 26 as the recommended version, alongside Windows 10 and 11 and RHEL-based Linux options. Linux availability is tied to specific license tiers.</p>



<h3 id="proxies-stop-being-special-materials-and-start-being-a-toggle" class="wp-block-heading">Proxies stop being special materials and start being a toggle</h3>



<p class="wp-block-paragraph">A core workflow change in 1.2 is how proxies work. Any flairShader material can now double as a proxy with a single toggle, replacing the old ShaderFX proxy material. The new proxy behavior supports Wobble, Offsets, and multiple NoiseFX types. That matters because the proxy no longer needs to be a separate shaded thing you babysit, it becomes a mode of the same material you already use.</p>
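<p class="wp-block-paragraph"><em>A minimal sketch of the idea, in hypothetical Python rather than Flair's actual API: the proxy stops being a separate material type and becomes a mode of the material you already have. All names here (FlairMaterial, is_proxy, effects) are illustrative assumptions.</em></p>

```python
from dataclasses import dataclass

# Illustrative model of the 1.2 proxy change: no separate ShaderFX proxy
# material -- a proxy is just the same material with one flag flipped.
# These names are assumptions, not Flair's real API.

@dataclass
class FlairMaterial:
    name: str
    is_proxy: bool = False          # the new single toggle
    effects: tuple = ()             # e.g. ("Wobble", "Offsets", "NoiseFX")

    def as_proxy(self) -> "FlairMaterial":
        # Same material, same effect stack -- only the mode changes,
        # so there is no second shaded thing to keep in sync.
        return FlairMaterial(self.name, True, self.effects)

paint = FlairMaterial("watercolor", effects=("Wobble", "NoiseFX"))
proxy = paint.as_proxy()
```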



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-01-14-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?quality=72&ssl=1"><img data-recalc-dims="1"  fetchpriority="high"  decoding="async"  width="1200"  height="675"  data-id="277355"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-01-14-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?resize=1200%2C675&quality=72&ssl=1"  alt="A digital art creation showcases a whimsical, cartoonish elephant with a patchwork design, playfully positioned with its trunk raised toward a large blue sphere. The background features a textured, off-white canvas, and a focused individual appears in a circular frame, observing the process on a computer screen."  class="wp-image-277355" ></a></figure>



<figure class="wp-block-image size-large"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-28-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?quality=72&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  data-id="277356"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-28-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?resize=1200%2C675&quality=72&ssl=1"  alt="An artistic digital rendering showcases a whimsical scene of a stylized cat and mouse intertwined on a textured background. The cat features unique patterns and hues, while the mouse displays delicate details. In the lower corner, a man, with short dark hair, speaks thoughtfully, adding a personable touch."  class="wp-image-277356" ></a></figure>
</figure>



<p class="wp-block-paragraph">This is a practical fix for hard-shell proxy intersections and visible component lines when proxy geometry overlaps other geometry. With the proxy living inside the same shader, you can apply effects like Wobble Blur to introduce diffusion and gradients rather than accepting a hard boundary. Proxy offsets can also localise line thresholds, so a region shows only depth-based silhouettes while the rest of the scene keeps different line logic. That kind of localised override can be the difference between a clean frame and a late night of per-shot hacks.</p>



<p class="wp-block-paragraph">You still want to test this carefully in production. Even when an update says it is seamless, changes to proxy behaviour can alter look dev deltas in ways that only show up once animation starts doing unpleasant things.</p>



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-07-21-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?quality=72&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-07-21-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?resize=1200%2C675&quality=72&ssl=1"  alt="A digital artwork depicting a whimsical robot sitting on stone blocks near a bridge, outlined in black and white. The scene is set against a dark backdrop, with the robot wearing a hat and expressive features. In the corner, a man gestures animatedly, likely explaining the piece."  class="wp-image-277360" ></a></figure>



<h3 id="lines-get-more-controllable-and-more-mischievous" class="wp-block-heading">Lines get more controllable and more mischievous</h3>



<p class="wp-block-paragraph"><a href="https://artineering.io/software/flair#feature-reels" title="">Flair 1.2</a> adds a set of line features that target a classic pain point: outlines and inlines are not the same thing, so stop treating them as if they are. A new Canvas Override global attribute can replace the beauty pass with the canvas colour. In practice, this lets you inspect what Flair effects and line work contribute, without the underlying shaded colours distracting you. The attribute appears in the globals documentation and in the release notes as a new global control.</p>



<p class="wp-block-paragraph">For outlines specifically, the update adds Outline Width Offset, letting you modify line width for outlines relative to inlines. In the video, this becomes a look-dev lever for graphic styles in which the silhouette reads as bold while interior lines stay subtle, or where outlines disappear entirely, leaving only internal structure.</p>
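<p class="wp-block-paragraph"><em>Conceptually, an offset like this is a tiny function: outlines derive their width from the inline width plus a signed offset, clamped at zero. The sketch below is an assumption about the mapping, not Flair's documented math.</em></p>

```python
def line_width(inline_width: float, outline_offset: float, is_outline: bool) -> float:
    """Illustrative take on an 'Outline Width Offset': outline width is
    defined relative to the inline width, clamped at zero so a large
    negative offset removes the silhouette entirely while interior lines
    survive. Names and formula are assumptions for illustration."""
    if not is_outline:
        return inline_width
    return max(0.0, inline_width + outline_offset)

# Bold silhouette, subtle interiors: offset +2 triples a 1px outline
# while inlines stay at 1px; offset -1.5 erases the outline completely.
```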



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="450"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/baker_noir_geo_blur.jpg?resize=1200%2C450&quality=80&ssl=1"  alt="https://cdn.sender.net/email_images/173419/images/22866/baker_noir_geo_blur.jpg"  class="wp-image-277371" ><figcaption class="wp-element-caption"><em>Baker and the Bridge scene by Conrad Justin (CC BY 4.0) – Stylized with Flair for Maya</em></figcaption></figure>



<p class="wp-block-paragraph">Light response also gets split. There are new controls over Outline and Inline Light Response for both clean and rough lines. That separation enables a sharper art direction trick: negative light response. Negative light response shows lines in shaded parts of the scene, which the update calls out as a “noir-style” option. In the demo, that produces lines where you would otherwise get pure darkness. It is a neat tool for stylized lighting setups where you want form to read in the shadows without repainting the entire grade.</p>
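<p class="wp-block-paragraph"><em>The "noir" trick is easiest to see as arithmetic. A toy model, assuming line opacity is simply base strength plus a response term scaled by incident light (the real shader math is not documented here): flip the response negative and lines appear where the light is not.</em></p>

```python
def line_opacity(base: float, light: float, response: float) -> float:
    """Toy model of a line 'light response': opacity = base + response *
    incident light, clamped to [0, 1]. A negative response makes lines
    strongest where light falls off -- i.e. they draw in shaded regions,
    the 'noir-style' behaviour described for Flair 1.2. Illustrative only."""
    return min(1.0, max(0.0, base + response * light))

# response = -1: full lines in shadow (light = 0), none in full light.
# response > 0: the conventional case -- lines live in the light.
```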



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-05-09-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?quality=72&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-05-09-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?resize=1200%2C675&quality=72&ssl=1"  alt="In a digital environment, a stylized figure stands confidently, sketched in black lines with subtle shading, wrapped tightly with cords, creating a sense of tension. In the bottom right corner, a person, appearing surprised, engages with the scene, their facial expression suggesting concentration."  class="wp-image-277361" ></a></figure>



<p class="wp-block-paragraph">Depth based sketchiness control arrives too, via Sketchiness Depth Range and Sketchiness Depth Factor. This lets sketchiness increase or decrease along scene depth, and it is presented as a global control that works across scenes, rather than relying only on placing proxy planes at different distances. For facilities that build reusable style presets across sequences, this is the sort of knob that can save time when layout changes the scale of a set and your line breakup suddenly looks wrong.</p>
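<p class="wp-block-paragraph"><em>A minimal sketch of how such a depth ramp could work, assuming a simple linear remap. The parameter names echo "Sketchiness Depth Range" and "Depth Factor" from the release notes, but the exact curve Flair uses is an assumption.</em></p>

```python
def sketchiness(depth: float, depth_range: tuple, depth_factor: float,
                base: float = 1.0) -> float:
    """Illustrative depth-driven sketchiness: remap scene depth into
    [0, 1] over a (near, far) range, then scale the base sketchiness by
    a factor along that ramp -- a global control, rather than proxy
    planes parked at different distances. Mapping is an assumption."""
    near, far = depth_range
    t = min(1.0, max(0.0, (depth - near) / (far - near)))
    return base * (1.0 + depth_factor * t)

# factor > 0: lines break up more with distance;
# factor < 0: distant lines calm down instead.
```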



<h3 id="licensing-grows-up-subscription-plus-options-studios-actually-ask-for" class="wp-block-heading">Licensing grows up: subscription, plus options studios actually ask for</h3>



<p class="wp-block-paragraph">Flair 1.2 expands licensing beyond subscription to include perpetual and floating licenses, and it introduces a licensing server to manage floating seats. Online licenses can be migrated to a new machine six hours after the previous activation. The release log also calls out fixes for activation on new Windows machines where wmic is missing, and an installer fix when an older install folder path no longer exists.</p>
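<p class="wp-block-paragraph"><em>The six-hour migration rule is simple enough to state as code. A sketch of the stated policy only; the function name and shape are invented for illustration, not Artineering's API.</em></p>

```python
from datetime import datetime, timedelta

MIGRATION_COOLDOWN = timedelta(hours=6)  # per the 1.2 release notes

def can_migrate(last_activation: datetime, now: datetime) -> bool:
    """Sketch of the stated rule: an online license may move to a new
    machine six hours after its previous activation. Hypothetical
    helper, not Artineering's actual licensing code."""
    return now - last_activation >= MIGRATION_COOLDOWN
```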



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  width="804"  height="354"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/activation_server.png?resize=804%2C354&quality=72&ssl=1"  alt="https://docs.artineering.io/media/setup/activation_server.png"  class="wp-image-277370" ></figure>



<p class="wp-block-paragraph">Activation for floating licensing occurs by entering a hostname or IP address for the license server, after which the client fetches available licenses and activates the first one that is free. That maps cleanly to typical facility patterns where artists should not care which workstation holds a seat today, only whether the pool has one left.</p>
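<p class="wp-block-paragraph"><em>That "first free seat" flow is the classic floating-license pattern. A toy version, assuming the server exposes the seat pool as a simple mapping (the real protocol is not documented in the release notes):</em></p>

```python
def activate_first_free(seats):
    """Toy version of the described flow: the client fetches the seat
    pool from the license server and activates the first seat not in
    use. 'seats' maps seat id -> in_use flag; the data structure and
    function name are assumptions for illustration."""
    for seat_id, in_use in seats.items():
        if not in_use:
            seats[seat_id] = True   # claim the seat
            return seat_id
    return None                     # pool exhausted

# Artists don't care which workstation held the seat yesterday --
# only whether this call returns a seat id or None.
```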



<h3 id="performance-and-saving-fewer-reasons-to-stare-at-a-frozen-ui" class="wp-block-heading">Performance and saving: fewer reasons to stare at a frozen UI</h3>



<p class="wp-block-paragraph">The Sequence Renderer gains asynchronous image saving to speed up rendering: image writes move to a separate process so they no longer interfere with rendering. Anti-aliasing also scales beyond 32 TAA samples, now up to 254 samples, and the camera list is no longer limited.</p>
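<p class="wp-block-paragraph"><em>The point of asynchronous saving is that the render loop hands frames off and keeps going instead of blocking on disk. A compact sketch of that producer-consumer handoff; note Flair moves writes to a separate <strong>process</strong>, while this illustration uses a background thread purely to stay self-contained.</em></p>

```python
import queue
import threading

def render_sequence(num_frames, save):
    """Sketch of asynchronous frame saving: rendering pushes finished
    frames onto a queue and a background worker performs the (slow)
    writes, so the render loop never waits on I/O. Flair 1.2 uses a
    separate process for this; a thread is used here for brevity."""
    frames = queue.Queue()

    def writer():
        while True:
            item = frames.get()
            if item is None:        # sentinel: all frames rendered
                break
            save(*item)

    worker = threading.Thread(target=writer)
    worker.start()
    for i in range(num_frames):
        image = f"frame_{i:04d}"    # stand-in for an actual render
        frames.put((i, image))      # hand off; do not wait for the write
    frames.put(None)
    worker.join()                   # drain remaining writes at the end

saved = []
render_sequence(3, lambda i, img: saved.append((i, img)))
```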



<figure class="wp-block-embed is-type-rich is-provider-embed-handler wp-block-embed-embed-handler"><div class="wp-block-embed__wrapper">
<div style="width: 640px;" class="wp-video"><video class="wp-video-shortcode" id="video-277352-1" width="640" height="360" preload="metadata" controls="controls"><source type="video/mp4" src="https://repo.artineering.io/videos/flair/real-time.mp4?_=1" /><a href="https://repo.artineering.io/videos/flair/real-time.mp4">https://repo.artineering.io/videos/flair/real-time.mp4</a></video></div>
</div></figure>



<p class="wp-block-paragraph">Material handling gets several workflow improvements. Keyed attributes in flairShader materials now work in Parallel and Serial evaluation modes. All material attributes appear in the Channel Box and the Attribute Spreadsheet. There are fixes for a GPU memory leak tied to assigning materials to components while scrubbing the timeline, and several other fixes around AOV blending, vertex baking, and material conversion.</p>



<p class="wp-block-paragraph">There is also a new “Sanitize Flair” button in the toolbox that unloads the plugin and tries to remove all traces of Flair from an open scene, with a confirmation dialog and no undo. That is the sort of scorched earth tool that you hope you never need, until the day you really need it.</p>



<figure class="wp-block-image is-resized"><img data-recalc-dims="1"  decoding="async"  width="600"  height="337"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/flair_in_unreal.gif?resize=600%2C337&ssl=1"  alt="https://cdn.sender.net/email_images/173419/images/22866/flair_in_unreal.gif"  class="wp-image-277372"  style="width:700px;height:auto" ><figcaption class="wp-element-caption"><em>A-COM Animation Sample by Agora Studio – Stylized with Flair for Unreal Engine</em></figcaption></figure>



<h3 id="unreal-engine-enters-the-chat-via-beta-testing" class="wp-block-heading">Unreal Engine enters the chat, via beta testing</h3>



<p class="wp-block-paragraph">Alongside the Maya 1.2 release, the <a href="https://share.sender.net/campaigns/fdHB/-flair-12-and-beta-testing-flair-for-unreal-engine">newsletter </a>announces beta testing for Flair for Unreal Engine. The team received an Epic MegaGrant last year, which led to a major roadmap overhaul and a late Maya update, while enabling focus on bringing Flair to Unreal Engine.</p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/6981ad2fde12ef8bbcabd584_202520update20-20card.png?resize=1200%2C675&quality=72&ssl=1"  alt="https://cdn.prod.website-files.com/62601097610537943c391d85/6981ad2fde12ef8bbcabd584_2025%20update%20-%20card.png"  class="wp-image-277369" ></figure>



<p class="wp-block-paragraph">The Unreal version is not nearly as feature-rich as the Maya version yet, but it is fully modular, designed so different effects can be mixed and matched into a custom style. Beta testers are recruited by replying to the email, with testing planned for April and a questionnaire at the end of the month to help prioritize features and guide future development. If you live in a pipeline where <a href="https://digitalproduction.com/tag/unreal/" title="Unreal">Unreal Engine</a> sits in previs, virtual production, or realtime look dev, a modular stylized renderer could be very useful. It could also be a moving target, because beta testing exists for a reason. </p>



<p class="wp-block-paragraph">One last reminder before you install anything on the main machine: validate look changes on a copy of your scene library, because even small line and proxy changes can ripple through a show. And yes, the noir lines look great in a demo, but your shots will still find a way to be difficult.</p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  width="600"  height="300"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/banner_1_2.gif?resize=600%2C300&ssl=1"  alt="https://cdn.sender.net/email_images/173419/images/22866/banner_1_2.gif"  class="wp-image-277373" ></figure>



<p class="wp-block-paragraph"><br /><a href="https://docs.artineering.io/flair/release-log/" title="">https://docs.artineering.io/flair/release-log/</a><br /><br /><a href="https://artineering.io/" title="">https://artineering.io/</a></p><p>The post <a href="https://digitalproduction.com/2026/05/13/artineering-flair-1-2-for-maya-hits-macos/">Artineering Flair 1.2 for Maya hits macOS</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></content:encoded>
					
		
		<enclosure url="https://repo.artineering.io/videos/flair/real-time.mp4" length="926442" type="video/mp4" />

		<enclosure url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-25-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?fit=1920%2C1080&#038;quality=72&#038;ssl=1" length="761232" type="image/jpg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-25-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?fit=1200%2C675&#038;quality=72&#038;ssl=1" width="1200" height="675" medium="image" type="image/jpeg">
	<media:copyright>DIGITAL PRODUCTION</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[A digital animation workspace features a stylized illustration of two cartoonish cats cuddling, with one sitting atop the other. The scene is textured, resembling soft fabric, and is complemented by vibrant colors. A person with dark hair is visible, discussing the project in the bottom corner.]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/oqbqdrzuxrm-00-02-25-20-flair-for-maya-12-new-features-explained-stylized-rendering.png?fit=1200%2C675&#038;quality=72&#038;ssl=1" width="1200" height="675" />
<post-id xmlns="com-wordpress:feed-additions:1">277352</post-id>	</item>
		<item>
		<title>SpyderPro learns new tricks: where colour goes wrong</title>
		<link>https://digitalproduction.com/2026/05/13/spyderpro-learns-new-tricks-where-colour-goes-wrong/</link>
		
		<dc:creator><![CDATA[Bela Beier]]></dc:creator>
		<pubDate>Wed, 13 May 2026 06:00:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[3D LUT export]]></category>
		<category><![CDATA[ambient light measurement]]></category>
		<category><![CDATA[C2PA]]></category>
		<category><![CDATA[color management]]></category>
		<category><![CDATA[ColorReader]]></category>
		<category><![CDATA[content credentials]]></category>
		<category><![CDATA[custom display simulation]]></category>
		<category><![CDATA[Datacolor]]></category>
		<category><![CDATA[Device Preview Plus]]></category>
		<category><![CDATA[display calibration]]></category>
		<category><![CDATA[HDR monitoring]]></category>
		<category><![CDATA[Heath Barber]]></category>
		<category><![CDATA[ICC profiles]]></category>
		<category><![CDATA[imaging workflow]]></category>
		<category><![CDATA[LightColor Meter]]></category>
		<category><![CDATA[monitor calibration]]></category>
		<category><![CDATA[post production]]></category>
		<category><![CDATA[provenance metadata]]></category>
		<category><![CDATA[SpyderPro]]></category>
		<category><![CDATA[subscribers]]></category>
		<category><![CDATA[video workflow]]></category>
		<guid isPermaLink="false">https://digitalproduction.com/?p=272840</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/2_landscapevignette_244241_spyderpro-2000x1335-1-1536x1025-1.jpg?fit=1200%2C801&quality=80&ssl=1" width="1200" height="801" title="" alt="A sleek black desk set against a textured wall showcases a vibrant collage of travel photographs, including iconic landmarks like the Golden Gate Bridge. Two computer screens display scenic landscapes, while office supplies and a camera rest on the desk, adding to the creative workspace atmosphere." /></div><div><p>Datacolor is pushing beyond display profiling with Device Preview Plus, C2PA tracking, and a software layer that wants to help you with other duties as well.</p>
<p>The post <a href="https://digitalproduction.com/2026/05/13/spyderpro-learns-new-tricks-where-colour-goes-wrong/">SpyderPro learns new tricks: where colour goes wrong</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/2_landscapevignette_244241_spyderpro-2000x1335-1-1536x1025-1.jpg?fit=1200%2C801&quality=80&ssl=1" width="1200" height="801" title="" alt="A sleek black desk set against a textured wall showcases a vibrant collage of travel photographs, including iconic landmarks like the Golden Gate Bridge. Two computer screens display scenic landscapes, while office supplies and a camera rest on the desk, adding to the creative workspace atmosphere." /></div><div><p class="wp-block-paragraph">What exactly is Device Preview Plus supposed to do once the calibration itself is finished? And how far can a monitor tool stretch before it starts trying to become part of the wider image pipeline? In this interview, Heath Barber speaks about Datacolor’s current direction for SpyderPro, including display simulation, custom device profiles, C2PA content credentials, and the role of calibration in workflows that no longer end at a single screen.<br /></p>

<div class="wp-block-image">
<figure class="alignleft size-full is-resized"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/1516314837537-1.jpg?quality=80&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="400"  height="400"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/1516314837537-1.jpg?resize=400%2C400&quality=80&ssl=1"  alt="A close-up portrait of a man with short, light brown hair, wearing a black polo shirt. He has a serious expression, looking directly at the camera, against a dark background that emphasizes his calm demeanor and focused gaze."  class="wp-image-272842"  style="width:176px;height:auto" ></a></figure>
</div>


<p class="wp-block-paragraph">Heath Barber (<a href="https://www.linkedin.com/in/heathbarber/">LinkedIn</a>) is Senior Product Director, Consumer Solutions at Datacolor and has spent nearly 21 years with the company in a series of senior software, product and imaging roles. His previous positions include Senior Director, Software Technology; Global Market & Technology Manager, Imaging; Global Software Technology Manager; and Director of Development, Imaging Color Solutions. That long run across Datacolor’s software and imaging side makes him a useful guide for this conversation.</p>



<h3 id="from-monitor-calibration-to-device-simulation" class="wp-block-heading">From monitor calibration to device simulation</h3>



<p class="wp-block-paragraph"><strong>DP: SpyderPro now includes Device Preview. On the box, that sounds simple enough. In practice, what does it do?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Device Preview is really a way of simulating how your image or media might look on other output devices without you having to physically move the file across ten different screens and start guessing what changed where.</p>



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="cZRUCLSGkTuPmVqoIjmGb8B0lpFglHO1rfNr57gVQuLxBOohIisw19JSev8MWEFAhXYiDaZHkAnQ4tK5c0C4PU3b"><blockquote class="wp-embedded-content" data-secret="6APfoQbTFv"><a href="https://digitalproduction.com/2025/11/21/crawly-for-videoscreens-spyderpro/">Crawly for Videoscreens: SpyderPro</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“Crawly for Videoscreens: SpyderPro” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/11/21/crawly-for-videoscreens-spyderpro/embed/#?secret=Fbn2H97Dot#?secret=6APfoQbTFv" data-secret="6APfoQbTFv" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph">The classic comparison is print preview. People in print workflows have done this for years. You work on one display, but you still want a reasonable idea of how the final output will behave on paper. Device Preview takes that idea and extends it to screens. Instead of asking only, “How will this print?”, we are asking, “How will this look on a phone, on a tablet, on another class of display, or on another type of viewing device?”</p>



<p class="wp-block-paragraph">That matters more than people sometimes admit. A lot of media today is first consumed on a phone. My wife is a wedding photographer, and like many photographers, one of the first things she wants to understand is how an image will look when a client opens it on a mobile device. That first impression often happens on an iPhone, an iPad or some Samsung tablet, not on a calibrated studio monitor. So what we did was build custom profiles that simulate those device classes as closely as possible on your own screen.</p>



<p class="wp-block-paragraph">In other words, you are still looking at your calibrated monitor, but the software is applying a transformation that approximates how the image would appear on another device. It is not magic, and it is not the same as physically holding that exact phone in your hand, but it gives you a much more informed preview than simply hoping for the best.</p>
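<p class="wp-block-paragraph">To make the idea concrete, here is a minimal sketch of such a transformation, assuming a hypothetical target display modelled only by a white-point tint and a gamma mismatch. This is an illustration of the concept, not Datacolor's actual simulation, which is driven by measured device profiles.</p>

```python
# Conceptual sketch: previewing how a normalized RGB value might shift on
# another display, modelled (hypothetically) as a per-channel white-point
# tint plus a tone-curve mismatch. Real device simulation works from
# measured ICC profiles, not from two hand-picked parameters like these.

def simulate_device(rgb, white=(1.0, 0.97, 0.92), gamma=1.1):
    """Approximate another display's rendering of a normalized RGB triple.

    white: per-channel scaling toward the target device's white point
           (the example values mimic a slightly warm screen).
    gamma: ratio between the target display's tone curve and the
           reference display's tone curve.
    """
    out = []
    for channel, scale in zip(rgb, white):
        # Tint toward the device white point, then bend the tone curve.
        v = (channel * scale) ** gamma
        out.append(min(max(v, 0.0), 1.0))  # clamp to displayable range
    return tuple(out)

# A pure white pixel picks up the warm tint of the simulated screen,
# while black stays black:
preview = simulate_device((1.0, 1.0, 1.0))
```

<p class="wp-block-paragraph">The point of the sketch is only the shape of the operation: your calibrated monitor still displays the result, but the values it displays have been pushed through a model of the other device first.</p>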



<h3 id="custom-targets-and-icc-profiles" class="wp-block-heading">Custom targets and ICC profiles</h3>



<p class="wp-block-paragraph"><strong>DP: There are already quite a few screen presets in there. Can users add their own?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, and that is where things get interesting. If you actually have access to the screen you want to simulate, you can create your own profile for it. The underlying idea here is the ICC profile, which is basically a standardised description of how a device represents colour.</p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.datacolor.com/spyder/wp-content/uploads/2025/10/computer.jpg?w=1200&quality=80&ssl=1"  alt="https://www.datacolor.com/spyder/wp-content/uploads/2025/10/computer.jpg" ></figure>



<p class="wp-block-paragraph">If you calibrate that display with SpyderPro, create an ICC profile for it, and place that profile where Device Preview expects it, the software can use it as another target. So you are not limited to the built-in presets forever. You can extend the system yourself if you have the hardware and you are willing to profile it properly.</p>



<p class="wp-block-paragraph">That is actually a useful kind of crowdsourcing. It means users can build simulation targets for screens we may not have added officially yet. Otherwise, the alternative is simply waiting for us to issue an update with more presets. This way, if you need a very specific display and you have access to it, you are not stuck.</p>



<p class="wp-block-paragraph">In practice, this matters because not all displays fail in the same way. Some are cooler, some warmer, some over-saturate, some clip shadow detail, some compress highlights in ugly ways. An ICC profile gives you a measured description of that behaviour. Once you have that, you can simulate it much more intelligently.</p>
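<p class="wp-block-paragraph">An ICC profile is, at its core, a structured binary file, and even its fixed 128-byte header tells you what kind of device it describes. The sketch below, using only Python's standard library, reads a few of those header fields; the offsets follow the published ICC specification, and the rest of the profile (tag table, curves, matrices) is ignored here.</p>

```python
import struct

# Minimal sketch: inspect the fixed 128-byte ICC profile header to see
# what kind of device a profile describes. Offsets per the ICC spec:
#   0-3   profile size (big-endian uInt32)
#   12-15 device class signature, e.g. 'mntr' for a monitor
#   16-19 data colour space, e.g. 'RGB '
#   20-23 profile connection space, 'XYZ ' or 'Lab '
#   36-39 must be the magic signature 'acsp'

def icc_header_info(header: bytes) -> dict:
    if len(header) < 128 or header[36:40] != b"acsp":
        raise ValueError("not an ICC profile header")
    size, = struct.unpack(">I", header[0:4])
    return {
        "size": size,
        "device_class": header[12:16].decode("ascii"),
        "color_space": header[16:20].decode("ascii"),
        "pcs": header[20:24].decode("ascii"),
    }
```

<p class="wp-block-paragraph">Reading a profile off disk would just mean passing the first 128 bytes of the file to this function; the measured behaviour the interview talks about lives in the tags that follow the header.</p>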



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="VVAtYDxbsTu3PS4slQXhMUXEiZnqNKSHKpOtkrT8QEoLap70f5GIe96fwC3cMdzdYHJjz2NyvGgFj2eWhUA"><blockquote class="wp-embedded-content" data-secret="uWDuw9Opgp"><a href="https://digitalproduction.com/2025/07/14/for-those-who-dare/">„For Those Who Dare“</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“„For Those Who Dare“” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/07/14/for-those-who-dare/embed/#?secret=SztSTO4aLs#?secret=uWDuw9Opgp" data-secret="uWDuw9Opgp" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph"><strong>DP: Can Device Preview also be useful in a more classic post-production or grading pipeline? For example, can it simulate ACES or broadcast-style viewing targets?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, absolutely. Some of those things are already there in one form or another (RAW, JPEG, PNG and TIFF, obviously, which covers screenshots and frame exports), and one of the goals of Device Preview Plus is to support a much broader set of output types and colour spaces than the earlier beta did.</p>



<p class="wp-block-paragraph">That is important because once you move beyond stills and basic display calibration, you run straight into colour-management questions. ACES, for example, is not just a buzzword people throw around to sound serious. It is a standardised colour pipeline used in film and post to help move images consistently between cameras, applications, VFX, grading and final delivery. Broadcast standards do something similar in their own context. They define how media should behave so that a signal is interpreted predictably.</p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.datacolor.com/spyder/wp-content/uploads/2025/11/Main_AdobeRGB-1200x1200-1.png?w=1200&quality=72&ssl=1"  alt="https://www.datacolor.com/spyder/wp-content/uploads/2025/11/Main_AdobeRGB-1200x1200-1.png" ></figure>



<p class="wp-block-paragraph">What Device Preview Plus is trying to do is give users a way to preview those kinds of transformations and output conditions within a simpler interface. It is not trying to replace a full finishing pipeline, but it does let you think more concretely about where your image is going and how it might look once it gets there.</p>



<p class="wp-block-paragraph">That matters because color correction is never done in a vacuum. You are always correcting toward some destination, even if people do not always say it that way.</p>



<p class="wp-block-paragraph"><strong>DP: So in the long run, could this also help compare different source looks or make camera matching easier before you are deep into grading?</strong></p>



<p class="wp-block-paragraph">Heath Barber: That is part of the bigger ambition, yes. I would phrase it less as “this replaces grading” and more as “this gives you more context earlier in the workflow.” A lot of users do not only want calibration. They want to understand what their material is going to do when it moves from capture through editing and eventually toward delivery. So one goal of this relaunch was to build a stronger foundation for that kind of ecosystem thinking.</p>



<p class="wp-block-paragraph">Part of that was hardware support, including better support for multiple monitors. Part of it was product structure, so users can start at one level and move upward without feeling that they bought into a dead end. But the big conceptual shift was image processing. That is what starts moving SpyderPro out of the little utility corner and into actual day-to-day workflow relevance.</p>



<p class="wp-block-paragraph">If you can preview, compare and process images more meaningfully before the very end of the chain, that is valuable. People do not always need a giant facility pipeline to benefit from that. A photographer, editor, videographer or small studio can still gain a lot by seeing likely output behavior earlier and more clearly.</p>



<h3 id="what-c2pa-actually-is" class="wp-block-heading"><strong>What C2PA actually is</strong></h3>



<p class="wp-block-paragraph"><strong>DP: Let’s say somebody changes an image in the pipeline and then blames someone else. How do I know who broke it?</strong></p>



<p class="wp-block-paragraph">Heath Barber: That is where C2PA comes into the conversation. C2PA is essentially a standard for content credentials, which has been driven by Adobe and a few other companies, including Meta and Google. The simplest way to explain it is that it works like a verified provenance layer for media, which is supplied by approved partners and products. It is a way of attaching information to an image, video or audio file so that the history of that asset can be preserved and inspected.</p>



<figure class="wp-block-embed is-type-rich is-provider-embed-handler wp-block-embed-embed-handler"><div class="wp-block-embed__wrapper">
<div style="width: 640px;" class="wp-video"><video class="wp-video-shortcode" id="video-272840-2" width="640" height="360" preload="metadata" controls="controls"><source type="video/mp4" src="https://www.datacolor.com/spyder/wp-content/uploads/2025/11/penguins.mp4?_=2" /><a href="https://www.datacolor.com/spyder/wp-content/uploads/2025/11/penguins.mp4">https://www.datacolor.com/spyder/wp-content/uploads/2025/11/penguins.mp4</a></video></div>
</div></figure>



<p class="wp-block-paragraph">Now, it is important not to oversell it. This is not an unbreakable security vault. It is not meant to function like heavy-duty encryption where the whole system collapses if one person touches the file. The goal is different. The goal is to create a reliable chain of information around authorship, origin and edits, so that platforms, applications and viewers can read that history and present it meaningfully.</p>



<p class="wp-block-paragraph">If somebody strips that information out, that itself can become part of the story. So the point is not only to preserve history, but also to make tampering or removal more visible. It is important to note that not everybody can supply this data: a C2PA approval stamp has to come from a verified supplier who integrates it. Stripping the data away, on the other hand, is easy, as anybody who has had an intern delete sidecar files knows.</p>



<p class="wp-block-paragraph">But adding that information requires a vetted, approved software company, like Adobe or Datacolor. Quite a few other big names in our space are currently going through that verification, by the way. We do not know if this system will be the one that solves the problem, but as far as we can guess today, it might stick.</p>



<p class="wp-block-paragraph">From our perspective, that becomes interesting because we sit so early in the image workflow. Calibration and device preview happen near the beginning of a chain. If you can begin attaching useful metadata there, and if that metadata continues downstream, then the image starts carrying some of its own history with it.</p>



<h3 id="tracking-edits-through-the-chain" class="wp-block-heading"><strong>Tracking edits through the chain</strong></h3>



<p class="wp-block-paragraph"><strong>DP: So if somebody in a studio changes an image and sends it on, that history could travel with the file?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, that is the idea, assuming the applications in the chain support reading, preserving and writing that metadata. If a file moves into Photoshop, for example, and Photoshop adds edits while preserving the content credentials, then a later viewer or platform that supports C2PA can show that there was an earlier version, then another change, then another change, and so on. That is why I often describe it more like versioning or historical tracking than some kind of hard lock. You are building a timeline of the asset.</p>
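<p class="wp-block-paragraph">Conceptually, that timeline can be pictured as a chain in which every entry commits to the entry before it, so a silent change or deletion becomes detectable. The sketch below illustrates only that hash-chaining idea; the real C2PA format uses cryptographically signed manifests produced by verified implementations, not ad-hoc JSON like this.</p>

```python
import hashlib
import json

# Illustration only: a hash-chained edit log in the spirit of the
# "timeline of the asset" described above. Each entry's hash covers its
# content plus the previous entry's hash, so editing or dropping any
# step invalidates everything after it.

def add_entry(chain, action, author):
    prev = chain[-1]["entry_hash"] if chain else ""
    body = {"action": action, "author": author, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "entry_hash": digest})
    return chain

def verify(chain):
    """Recompute every link; a modified or removed entry breaks the chain."""
    prev = ""
    for entry in chain:
        body = {"action": entry["action"], "author": entry["author"], "prev": prev}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != entry["entry_hash"] or entry["prev"] != prev:
            return False
        prev = digest
    return True

history = []
add_entry(history, "captured", "camera")
add_entry(history, "color graded", "colorist")
add_entry(history, "exported for web", "editor")
```

<p class="wp-block-paragraph">Rewriting the colorist's entry after the fact makes <code>verify</code> fail, which is the same property that lets a C2PA-aware viewer flag a break in an asset's history.</p>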



<p class="wp-block-paragraph">And that is valuable because a surprising amount of confusion in media workflows is not about whether an image was changed. Of course it was changed. The question is how, where, by whom, and in which order. That is where provenance becomes useful. It turns vague suspicion into something much more inspectable.</p>



<p class="wp-block-paragraph">Why that matters for media professionals is obvious. If you are a photographer, editor, colorist or VFX artist, so-called technical metadata is not boring background noise. It is often the only trail of breadcrumbs you get when something looks wrong and you need to work backwards.</p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.datacolor.com/spyder/wp-content/uploads/elementor/thumbs/Authenticity_Photojournalism--reb0ng49h7lnwpb66qv0gjqjhu55d0hcx84ovilisw.jpg?w=1200&quality=80&ssl=1"  alt="https://www.datacolor.com/spyder/wp-content/uploads/elementor/thumbs/Authenticity_Photojournalism--reb0ng49h7lnwpb66qv0gjqjhu55d0hcx84ovilisw.jpg" ></figure>



<p class="wp-block-paragraph"><strong>DP: Could C2PA become a foundation for more reliable metadata in professional media workflows?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, I think it could. Right now, the public conversation around C2PA is often dominated by AI detection, content authenticity and platform labeling. Those are real issues, but for media professionals there is another side to it, and that side is workflow visibility.</p>



<p class="wp-block-paragraph">A lot of tools and viewers today do not really surface the kind of information a colorist, editor or photographer would care about most. To them, that data may just look like clutter. To us, it can be gold. It can tell you what settings were used, what was embedded, what transformations were applied, and how the file evolved.</p>



<p class="wp-block-paragraph">That is why I think there is a major opportunity here that goes beyond headlines about “real versus fake.” For working professionals, the value may be that you can finally see what happened to an asset in a way that is consistent and transportable.</p>



<h3 id="camera-metadata-edit-history-and-proof" class="wp-block-heading">Camera metadata, edit history and proof</h3>



<p class="wp-block-paragraph"><strong>DP: I assume camera makers already contribute a lot of that information.</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, they do. A camera can capture all kinds of useful metadata: model, lens, focal length, exposure settings, stop values, capture conditions and more. That information is already part of the professional world. The big question is not whether metadata exists. It does. The question is whether that metadata survives and whether it remains visible and trustworthy as the file moves through the pipeline.</p>



<p class="wp-block-paragraph">If you can preserve camera-side information, then add edit-side information, then preserve output and transformation information, suddenly you have something much more powerful than isolated metadata fragments. You have a chain.</p>



<p class="wp-block-paragraph"><strong>DP: So, for a professional production, that could eventually help prove what is captured, what is altered, and what is generated?</strong></p>



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="6b3UGNPwYrWouIqnOH4hipfS150RlxEz8mLCjFZBvKeTck27QVJ"><blockquote class="wp-embedded-content" data-secret="SwBkKA6C8s"><a href="https://digitalproduction.com/2025/09/09/datacolors-spyderexpress-colour-calibration-on-a-budget-now-with-growth-path-options/">Datacolor’s SpyderExpress: Colour Calibration on a Budget, Now with Growth Path Options</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“Datacolor’s SpyderExpress: Colour Calibration on a Budget, Now with Growth Path Options” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/09/09/datacolors-spyderexpress-colour-calibration-on-a-budget-now-with-growth-path-options/embed/#?secret=LhnnEXjMgS#?secret=SwBkKA6C8s" data-secret="SwBkKA6C8s" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph">Heath Barber: That is certainly one of the ambitions around this whole space. It is tricky because media workflows are messy, and the real world is rarely as neat as standards documents would like it to be. But yes, the direction is toward being able to say: this is where the asset originated, this is what was done to it, and this is what remains attributable and traceable. That is useful not only for protecting creators, but also for protecting the workflow itself. When something goes wrong, being able to inspect the chain matters.</p>



<p class="wp-block-paragraph"><strong>DP: Ideally, you would like every screen in the workflow to have a little Spyder hanging from it.</strong></p>



<p class="wp-block-paragraph">Heath Barber: Sure, but the larger point is that I want to push us more into software. Spyder, as a hardware product, is well established. People know what a Spyder does, and the current platform supports most of the display technology our customers are working with today and what we see coming in the next several years. That is exactly what frees us to put our energy into software, because that is where we can deliver the most value the fastest. Workflow integration, format support, analysis tools, environmental adaptation, and content credentials. Those are software problems, and software is how we solve them best.</p>



<p class="wp-block-paragraph">So Spyder remains the foundation, but the surrounding software layer is where we can build more capability and more relevance. Things like device simulation, image processing, and metadata let us move beyond “put sensor on screen, click calibrate, done.” That workflow still matters, but it does not have to be the whole story. And honestly, we are not done innovating on display calibration or colour-critical display analysis. </p>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.datacolor.com/spyder/wp-content/uploads/2024/08/spyderpro-feature-1.jpg?w=1200&quality=80&ssl=1"  alt="https://www.datacolor.com/spyder/wp-content/uploads/2024/08/spyderpro-feature-1.jpg" ></figure>



<p class="wp-block-paragraph"><strong>DP: You recently updated the Spyder software. What is next?</strong></p>



<p class="wp-block-paragraph">Heath Barber: There is a lot in the pipeline, but the part I can talk about today is the ecosystem side. We are working closely with key vendors across both video and photography to make sure the experience is as seamless as possible from capture through delivery. The goal is to give creators the critical tools they need to do their best work and to genuinely improve their productivity, not to add another piece of software they have to babysit.</p>



<p class="wp-block-paragraph">Imaging is a team sport. People move between cameras, software, displays, tablets, phones, edit systems, grading tools, review tools, and delivery platforms all the time. So if you want to be relevant, you cannot act like calibration lives on its own little island. It has to connect into the broader ecosystem. That does not mean every user needs a huge enterprise pipeline. It does mean the tools should feel aware of how people actually work.</p>



<p class="wp-block-paragraph">The market has also changed. A lot of users today are not interested in spending endless hours tweaking every parameter inside complicated full-featured suites that are harder to use than the editing and DCC tools they are already trying to keep up with. They want tools that do most of the job well, clearly and efficiently. That is not laziness. That is reality. So part of the job is figuring out where in the workflow we can add value without making the tool feel like homework.</p>






<figure class="wp-block-image"><img data-recalc-dims="1" height="1186" width="1200"  decoding="async"  src="https://i0.wp.com/www.datacolor.com/spyder/wp-content/uploads/2025/11/9.-Datacolor_LightColor-Meter_Smartphone-on-flat-surface-with-device-1536x1518.jpg?resize=1200%2C1186&quality=80&ssl=1"  alt="https://www.datacolor.com/spyder/wp-content/uploads/2025/11/9.-Datacolor_LightColor-Meter_Smartphone-on-flat-surface-with-device-1536x1518.jpg" ></figure>



<h3 id="light-meter-integration-and-the-viewing-environment" class="wp-block-heading">Light-meter integration and the viewing environment</h3>



<p class="wp-block-paragraph"><strong>DP: What about the light meter side? Is that becoming a bigger part of the system too?</strong></p>



<p class="wp-block-paragraph">Heath Barber: Yes, very much so. That is one of the more exciting areas for us. Built-in ambient light sensing is useful up to a point, but it is broad and limited. A dedicated light meter is much more accurate, and once you have that, you can begin describing the viewing environment in a much more meaningful way.</p>



<p class="wp-block-paragraph">Right now, we already think in terms of multiple zones around the viewing setup: What is the light behind the screen? What is the light facing the viewer? What is happening around the workspace? The current system uses that information in a relatively constrained way, but the longer-term plan is to expand that into a much richer map of the environment, because your monitor does not exist in a vacuum either. The room changes what you perceive. Ambient light, direction, intensity and colour temperature all influence how you see the image.</p>



<p class="wp-block-paragraph">So if you can measure the environment more accurately, you can calibrate more intelligently and also preview output conditions more realistically. In other words, better environmental data improves both the monitor correction side and the device preview side.</p>



<h3 id="from-three-zones-to-room-mapping" class="wp-block-heading">From three zones to room mapping</h3>



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="7AO1KlwE84fp0FB9a3RHjLUgP6oSkGCZMdyzbmYntQiervIqhTWVN5cuJX2sx"><blockquote class="wp-embedded-content" data-secret="ZyQ8u0LMgh"><a href="https://digitalproduction.com/2025/05/26/getting-colors-right-with-the-datacolor-lightcolor-meter/">Getting Colors Right with the Datacolor LightColor Meter</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“Getting Colors Right with the Datacolor LightColor Meter” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/05/26/getting-colors-right-with-the-datacolor-lightcolor-meter/embed/#?secret=OLk5kVUh2Z#?secret=ZyQ8u0LMgh" data-secret="ZyQ8u0LMgh" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph"><strong>DP: So eventually, instead of just one or two measurements, you could map the room.</strong></p>



<p class="wp-block-paragraph">Heath Barber: Exactly. Think of it less as a single reading and more as building a spatial picture of the environment. If you know what the room is doing around the display, then you can make much smarter decisions.</p>



<p class="wp-block-paragraph">And that leads to some interesting possibilities. If you have multiple measuring points, or eventually multiple devices monitoring the environment, you can start blending those readings into a more complete model. That could matter for studio setups, mobile work or any workflow where lighting conditions are inconsistent.</p>
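<p class="wp-block-paragraph">As a rough illustration, blending several zone readings could look like the sketch below. The zone names, weights and values are hypothetical, and a real system would average chromaticities rather than correlated colour temperatures, though working in mireds (a reciprocal scale) already behaves better than averaging kelvin directly.</p>

```python
# Hypothetical sketch: combine ambient-light readings from several zones
# around a display into one weighted estimate of illuminance and CCT.

def blend_readings(readings):
    """readings: list of (weight, lux, cct_kelvin) per measurement zone."""
    total = sum(w for w, _, _ in readings)
    lux = sum(w * lx for w, lx, _ in readings) / total
    # Average CCT in mireds (1e6 / K): reciprocal space is closer to how
    # colour-temperature differences are perceived than raw kelvin.
    mired = sum(w * (1e6 / cct) for w, _, cct in readings) / total
    return lux, 1e6 / mired

zones = [
    (0.5, 120.0, 4000.0),  # behind the screen (bias light)
    (0.3, 200.0, 5000.0),  # facing the viewer
    (0.2, 80.0, 3200.0),   # rest of the workspace
]
ambient_lux, ambient_cct = blend_readings(zones)
```

<p class="wp-block-paragraph">With more measuring points, the same idea extends from three fixed zones toward the kind of spatial model of the room described above; only the list of readings grows.</p>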



<p class="wp-block-paragraph">Once you start seeing it that way, calibration becomes less about a single screen measurement and more about understanding the conditions in which that screen is actually being used.</p>



<p class="wp-block-paragraph"><strong>DP: So the real shift here is that calibration is no longer treated as an isolated task.</strong></p>



<p class="wp-block-paragraph">Heath Barber: Exactly. That is really the core of it. Calibration still matters, obviously. It is the foundation. But once that foundation is in place, the next logical step is to ask what else can be built on top of it. That is where device simulation, image processing, metadata, content credentials and environmental measurement all come in. We are trying to move from a narrow utility toward something that participates more fully in the image workflow.</p>



<p class="wp-block-paragraph">That does not mean it suddenly becomes a complete post-production platform. It does mean we are trying to make it more useful to the way people actually create, review and deliver media today.</p><p>The post <a href="https://digitalproduction.com/2026/05/13/spyderpro-learns-new-tricks-where-colour-goes-wrong/">SpyderPro learns new tricks: where colour goes wrong</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></content:encoded>
					
		
		<enclosure url="https://www.datacolor.com/spyder/wp-content/uploads/2025/11/penguins.mp4" length="1081954" type="video/mp4" />

		<enclosure url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/2_landscapevignette_244241_spyderpro-2000x1335-1-1536x1025-1.jpg?fit=1536%2C1025&#038;quality=80&#038;ssl=1" length="180438" type="image/jpg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/2_landscapevignette_244241_spyderpro-2000x1335-1-1536x1025-1.jpg?fit=1200%2C801&#038;quality=80&#038;ssl=1" width="1200" height="801" medium="image" type="image/jpeg">
	<media:copyright>DIGITAL PRODUCTION</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[A sleek black desk set against a textured wall showcases a vibrant collage of travel photographs, including iconic landmarks like the Golden Gate Bridge. Two computer screens display scenic landscapes, while office supplies and a camera rest on the desk, adding to the creative workspace atmosphere.]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/04/2_landscapevignette_244241_spyderpro-2000x1335-1-1536x1025-1.jpg?fit=1200%2C801&#038;quality=80&#038;ssl=1" width="1200" height="801" />
<post-id xmlns="com-wordpress:feed-additions:1">272840</post-id>	</item>
		<item>
		<title>Foundry expands Nuke Stage for LED walls</title>
		<link>https://digitalproduction.com/2026/05/13/foundry-expands-nuke-stage-for-led-walls/</link>
		
		<dc:creator><![CDATA[Bela Beier]]></dc:creator>
		<pubDate>Wed, 13 May 2026 05:00:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[Compositing]]></category>
		<category><![CDATA[Foundry]]></category>
		<category><![CDATA[ICVFX]]></category>
		<category><![CDATA[LED walls]]></category>
		<category><![CDATA[Nuke]]></category>
		<category><![CDATA[Nuke Stage]]></category>
		<category><![CDATA[OpenColorIO]]></category>
		<category><![CDATA[OpenEXR]]></category>
		<category><![CDATA[OpenUSD]]></category>
		<category><![CDATA[subscribers]]></category>
		<category><![CDATA[virtual production]]></category>
		<guid isPermaLink="false">https://digitalproduction.com/?p=277575</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/playback-v22x.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A striking orange motorcycle stands prominently in a dimly lit urban environment, surrounded by an atmosphere of neon-lit streets projected onto the walls. A filming camera is positioned nearby, capturing the scene, creating a dynamic blend of technology and artistry." /></div><div><p>Nuke Stage now plays NotchLC, handles Gaussian Splats, logs on-set metadata in a Vault, and pushes USD scene edits into real time LED wall playback.</p>
<p>The post <a href="https://digitalproduction.com/2026/05/13/foundry-expands-nuke-stage-for-led-walls/">Foundry expands Nuke Stage for LED walls</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/playback-v22x.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A striking orange motorcycle stands prominently in a dimly lit urban environment, surrounded by an atmosphere of neon-lit streets projected onto the walls. A filming camera is positioned nearby, capturing the scene, creating a dynamic blend of technology and artistry." /></div><div><p class="wp-block-paragraph">The latest update to <a href="https://www.foundry.com/products/nuke-stage" title="">Nuke Stage</a> stays with the <a href="https://digitalproduction.com/2025/04/02/nuke-stage-foundry-introduces-a-virtual-production-tool/" title="">preview</a>: real-time playback of photoreal environments on LED walls, plus live compositing tools that should feel familiar to artists. The tool targets virtual production and in-camera visual effects (<a href="https://digitalproduction.com/tag/icvfx" title="">ICVFX</a>), with a workflow designed to keep VFX artists in creative control on set.</p>


<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe class="youtube-player" width="1200" height="675" src="https://www.youtube.com/embed/UyfHzf9K978?version=3&rel=1&showsearch=0&showinfo=1&iv_load_policy=1&fs=1&hl=en-US&autohide=2&wmode=transparent" allowfullscreen="true" style="border:0;" sandbox="allow-scripts allow-same-origin allow-popups allow-presentation allow-popups-to-escape-sandbox"></iframe>
</div></figure>



<p class="wp-block-paragraph">The core stays high-resolution playback for 2D and 2.5D environments, with EXR as the playback format and <a href="https://digitalproduction.com/tag/usd/" title="USD">USD </a>used for 3D geometry. Live compositing comes from a set of Nuke nodes rewritten for real-time performance, so operators can blend the on-set environment with the physical build and manage the relationship between camera and screen.</p>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/4.%20Creative%20Control.jpeg?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/4.%20Creative%20Control.jpeg?itok=HQ_VELSN"  style="width:428px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">The practical takeaway: this is not a game engine project that happens to output to a wall. It is a VFX-flavoured playback and live comp system that tries to keep your pipeline language intact.</p>



<h3 id="the-parts-you-actually-run-on-set" class="wp-block-heading">The parts you actually run on set</h3>



<p class="wp-block-paragraph">Foundry built a three-part architecture: an editor where the operator works, a headless networking relay, and render nodes that run on each machine driving the wall. A launcher defines where those processes run and can start them across a larger stage; the editor then connects to the relay and render nodes and reports cluster health.</p>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/9.%20Nuke%20Interface.jpeg?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/9.%20Nuke%20Interface.jpeg?itok=26hiMAES"  style="width:428px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">A typical topology puts the editor and relay on the control machine, distributes renders to the render nodes, then sends images to the wall. The setup flow walks through networking first, then hardware outputs and physical displays, then display mapping, and then cameras and tracking. Configurations can be exported and imported so teams do not have to repeat the same stage setup every day, unless they want to. In general, that is the unglamorous part of <a href="https://digitalproduction.com/tag/virtual-production/?utm_source=chatgpt.com">virtual production</a> that still decides whether you shoot before lunch or after dinner.</p>
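To make the export-and-import idea concrete, here is a minimal sketch of the kind of stage configuration a team might save and restore. The node roles, hostnames, and fields are invented for illustration; Nuke Stage's actual configuration format is not public in the sources.

```python
import json

# Hypothetical stage description: editor and relay on the control
# machine, render nodes driving segments of the wall.
stage_config = {
    "stage": "Stage A",
    "nodes": [
        {"role": "editor", "host": "control-01"},
        {"role": "relay",  "host": "control-01"},
        {"role": "render", "host": "render-01", "outputs": ["wall-left"]},
        {"role": "render", "host": "render-02", "outputs": ["wall-right"]},
    ],
    "tracking": {"protocol": "freeD", "port": 40000},
}

def export_config(cfg: dict) -> str:
    """Serialise a stage setup so it can be reused the next shoot day."""
    return json.dumps(cfg, indent=2)

def import_config(blob: str) -> dict:
    """Re-create the stage setup from a saved export."""
    return json.loads(blob)

restored = import_config(export_config(stage_config))
render_hosts = [n["host"] for n in restored["nodes"] if n["role"] == "render"]
print(render_hosts)  # ['render-01', 'render-02']
```

The point is not the format but the workflow: a round-trippable config file is what turns a bespoke stage setup into a repeatable one.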



<h3 id="metadata-vault-your-future-self-says-thanks" class="wp-block-heading">Metadata Vault: your future self says thanks</h3>



<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/6.%20NukeStage_Vault_001.png?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/6.%20NukeStage_Vault_001.png?itok=pW5I9BTM" ></figure>



<p class="wp-block-paragraph">The headline feature is metadata capture via a Vault that records on-set decisions for handoff into post. Nuke Stage supports logging scene data, camera tracking, lens metadata, timecode, scene settings, and colour decisions. The Vault documentation also describes two capture workflows: Snapshots for a point-in-time save, and Take Record, which creates a camera USD file between defined start and end points.</p>
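A rough sketch of what those two capture workflows imply as data structures. Class and field names here are invented for illustration; the real Vault schema is not public in the sources.

```python
from dataclasses import dataclass, field

@dataclass
class Snapshot:
    """Point-in-time save of the stage state (hypothetical fields)."""
    timecode: str
    scene_settings: dict
    colour_decisions: dict

@dataclass
class TakeRecord:
    """A capture spanning defined start/end points, with a camera USD."""
    start_tc: str
    end_tc: str
    camera_usd: str  # path to the camera USD written for the take
    lens_metadata: list = field(default_factory=list)

    def log_frame(self, focal_mm: float, focus_m: float) -> None:
        # Per-frame lens state, appended while the take records.
        self.lens_metadata.append({"focal_mm": focal_mm, "focus_m": focus_m})

take = TakeRecord("01:02:03:04", "01:02:10:00", "takes/take03_cam.usd")
take.log_frame(35.0, 2.4)
take.log_frame(35.0, 2.1)
print(len(take.lens_metadata))  # 2
```

Whatever the real schema looks like, the value is the same: the take carries its own camera, lens, and colour record into post instead of relying on a notes document.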



<p class="wp-block-paragraph">If you have ever tried to reproduce a wall look from a vague note like “match take three but slightly warmer with some *Ooompf*”, this aims to make the record concrete. It will not fix bad habits, but it can at least preserve what you actually did on set.</p>



<h3 id="notchlc-and-exr-playback-plus-a-faster-sequencer-loop" class="wp-block-heading">NotchLC and EXR playback, plus a faster sequencer loop</h3>



<p class="wp-block-paragraph">On the playback side, the feature list now includes native support for <a href="https://notchlc.notch.one/?utm_source=chatgpt.com">NotchLC</a> and B44 EXR for high-resolution background playback. There is a drag-and-drop sequencer panel, one-click keyframing, and editing and interpolation tools for timing changes.</p>
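Why playback codecs are a headline item at all: a back-of-envelope data-rate calculation for uncompressed half-float EXR playback. The resolution and channel count below are illustrative, not a Nuke Stage requirement; the B44 figure uses OpenEXR's documented fixed-rate encoding of each 4×4 block of half-float samples into 14 bytes instead of 32.

```python
def exr_rate_gib_s(width, height, channels=4, bytes_per_sample=2, fps=24):
    """Uncompressed data rate for a half-float EXR sequence, in GiB/s."""
    frame_bytes = width * height * channels * bytes_per_sample
    return frame_bytes * fps / 2**30

raw = exr_rate_gib_s(3840, 2160)
print(f"{raw:.2f} GiB/s uncompressed")  # 1.48 GiB/s uncompressed

# B44 packs each 4x4 block of halves into 14 bytes (vs 32 raw),
# a fixed ~2.3:1 ratio, which is what makes it playback-friendly.
b44 = raw * 14 / 32
print(f"{b44:.2f} GiB/s with B44")      # 0.65 GiB/s with B44
```

At roughly 1.5 GiB/s per uncompressed UHD stream, a multi-output wall quickly saturates storage and bus bandwidth, which is why fixed-rate codecs like B44 and NotchLC matter for this class of tool.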


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/card/public/media/image/2026-05/2-sequencer-video-1920x10801%402x.jpg?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/card/public/media/image/2026-05/2-sequencer-video-1920x10801@2x.jpg?itok=Bo6MgdST"  style="aspect-ratio:1.7778077680542133;width:370px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">In the operator view, the track-based sequencer remains central. Content created in Nuke can be staged over time, keyframed, and adjusted live for collaborative iteration with the on-set team.</p>



<h3 id="usd-scenes-and-splats-now-aimed-at-the-wall" class="wp-block-heading">USD scenes and splats, now aimed at the wall</h3>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/5.%20USD.jpeg?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/5.%20USD.jpeg?itok=dxyOuowR"  style="aspect-ratio:1.893447990994323;width:369px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">Nuke Stage supports importing <a href="https://openusd.org/?" title="">OpenUSD</a> scenes that can be edited and overridden for real-time blending of virtual and physical sets. For file and colour standards, the tool lists <a href="https://openexr.com/" title="">OpenEXR</a> and <a href="https://opencolorio.org/" title="">OpenColorIO</a> alongside ACEScg support via <a href="https://acescentral.com/" title="">ACES</a>. It also claims support for HDR and a color pipeline intended to carry creative decisions from set into post.</p>



<p class="wp-block-paragraph">For scene representations, the tool now highlights importing and controlling Gaussian Splats for high-fidelity 3D scenes, alongside standard USD geometry. In the broader ecosystem, <a href="https://www.foundry.com/news-and-awards/foundry-releases-nuke-17-advancing-compositing-workflows" title="">Foundry</a> also shipped Gaussian Splat support in Nuke and built a USD-based 3D system there too, which positions splats and USD as shared data types across stage and post.</p>



<h3 id="standard-hardware-scaling-and-the-usual-fine-print" class="wp-block-heading">Standard hardware, scaling, and the usual fine print</h3>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/7.%20Control%20Cost.jpeg?w=1200&ssl=1"  alt="https://www.foundry.com/sites/default/files/styles/hero_1440_x_825_/public/2026-04/7.%20Control%20Cost.jpeg?itok=60bafO0o"  style="width:410px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">The product page repeats the goal of running on commodity hardware without specialist media servers or bespoke equipment, and scaling across render node clusters. It frames this as cost and setup-time control and as a way to make stages repeatable.</p>



<p class="wp-block-paragraph">Pricing is not specified in the sources. The product page offers a register-your-interest contact flow.</p>



<p class="wp-block-paragraph">New tools and innovations should be tested before use in production, especially anything that touches playback codecs, cluster timing, genlock, tracking inputs, and colour consistency. Run your own worst-case plate, your own camera tracking feed, and your own handoff into post before you bet a shoot day on it.</p>



<p class="wp-block-paragraph"><a href="https://www.foundry.com/products/nuke-stage" title="">https://www.foundry.com/products/nuke-stage</a></p><p>The post <a href="https://digitalproduction.com/2026/05/13/foundry-expands-nuke-stage-for-led-walls/">Foundry expands Nuke Stage for LED walls</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></content:encoded>
					
		
		
		<enclosure url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/playback-v22x.jpg?fit=1440%2C810&#038;quality=80&#038;ssl=1" length="76899" type="image/jpg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/playback-v22x.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" medium="image" type="image/jpeg">
	<media:copyright>DIGITAL PRODUCTION</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[A striking orange motorcycle stands prominently in a dimly lit urban environment, surrounded by an atmosphere of neon-lit streets projected onto the walls. A filming camera is positioned nearby, capturing the scene, creating a dynamic blend of technology and artistry.]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/playback-v22x.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" />
<post-id xmlns="com-wordpress:feed-additions:1">277575</post-id>	</item>
		<item>
		<title>ZEISS CinCraft LensCore targets Nuke lens looks</title>
		<link>https://digitalproduction.com/2026/05/12/zeiss-cincraft-lenscore-targets-nuke-lens-looks/</link>
		
		<dc:creator><![CDATA[Bela Beier]]></dc:creator>
		<pubDate>Tue, 12 May 2026 08:00:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[topnews]]></category>
		<category><![CDATA[CinCraft LensCore]]></category>
		<category><![CDATA[Compositing]]></category>
		<category><![CDATA[FMX]]></category>
		<category><![CDATA[lens distortion]]></category>
		<category><![CDATA[Lenscore]]></category>
		<category><![CDATA[nuke plugin]]></category>
		<category><![CDATA[scenario]]></category>
		<category><![CDATA[Simulation]]></category>
		<category><![CDATA[The Foundry]]></category>
		<category><![CDATA[VFX]]></category>
		<category><![CDATA[Zeiss]]></category>
		<guid isPermaLink="false">https://digitalproduction.com/?p=276310</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/1635fee8-66d0-413a-a72c-24ea227ecd85.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A dimly lit alleyway is filled with vibrant neon signs in Japanese, casting colorful reflections on wet surfaces. Empty tables and chairs hint at recent activity, while the blurred background suggests a lively atmosphere of food stalls awaiting patrons." /></div><div><p>LensCore brings lens profiles, ray-traced lens behavior, and physics-driven controls to Nuke, aiming for fast, repeatable lens looks across sequences.</p>
<p>The post <a href="https://digitalproduction.com/2026/05/12/zeiss-cincraft-lenscore-targets-nuke-lens-looks/">ZEISS CinCraft LensCore targets Nuke lens looks</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></description>
										<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/1635fee8-66d0-413a-a72c-24ea227ecd85.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A dimly lit alleyway is filled with vibrant neon signs in Japanese, casting colorful reflections on wet surfaces. Empty tables and chairs hint at recent activity, while the blurred background suggests a lively atmosphere of food stalls awaiting patrons." /></div><div><p class="wp-block-paragraph"><em>For those who don’t know the tool: <a href="https://www.zeiss.com/photonics-and-optics/en/cinematography/cincraft/lenscore.html?utm_source=chatgpt.com">CinCraft LensCore</a> is a <a href="https://www.foundry.com/products/nuke-family/nuke?utm_source=chatgpt.com">Nuke</a> plugin for <a href="https://digitalproduction.com/tag/compositing/?utm_source=chatgpt.com">compositing</a> and it sits late in post inside the <a href="https://cincraft.zeiss.com/?utm_source=chatgpt.com">CinCraft</a> lens data ecosystem.</em></p>


<h3 id="ray-traced-lens-behavior-inside-a-comp" class="wp-block-heading">Ray traced lens behavior, inside a comp</h3>



<p class="wp-block-paragraph"><a href="https://www.zeiss.com/">ZEISS</a> describes CinCraft LensCore as a physically based way to create cinematic lens looks for visual effects and animation in a 2D comp. It builds on <a href="https://www.zeiss.com/photonics-and-optics/us/cinematography/cincraft/virtual-lens-technology.html?utm_source=chatgpt.com">Virtual Lens Technology</a>, shown at <a href="https://digitalproduction.com/tag/fmx/?utm_source=chatgpt.com">FMX</a> 2025, and it targets the hand-built lens look stacks that often drift from shot to shot once schedules get real. Digital Production has tracked this rollout from the earlier Virtual Lens Technology push, first with <a href="https://digitalproduction.com/2025/05/06/virtual-glass-zeiss-enters-the-simulated-optics-game/?utm_source=chatgpt.com">Virtual Glass: ZEISS enters the simulated optics game</a> and later with <a href="https://digitalproduction.com/2025/10/17/zeiss-cincraft-virtual-lens-enters-beta-real-glass-virtual-magic/">ZEISS CinCraft Virtual Lens enters beta: real glass, virtual magic</a>, which sets LensCore up as the compositing side of that same lens data story.</p>



<p class="wp-block-paragraph">LensCore centres on a GPU-accelerated, ray-traced rendering engine for Nuke that simulates lens behaviour across every pixel and every frame, which ZEISS says goes beyond typical digital lens effects. The demo at the FMX booth looked great, but you’ll have to test it yourself with your own footage and renders.</p>



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/rendered-with-cincraft-lenscore.jpg?quality=80&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/rendered-with-cincraft-lenscore.jpg?resize=1200%2C675&quality=80&ssl=1"  alt="A narrow, dimly-lit street lined with colorful neon signs in various hues. Empty metal tables and stools rest under a shelter, reflecting the wet pavement. The atmosphere is vibrant yet serene, evoking the lively ambiance of an urban night market."  class="wp-image-277274" ></a></figure>



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/raw-cg-render.jpg?quality=80&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="675"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/raw-cg-render.jpg?resize=1200%2C675&quality=80&ssl=1"  alt="A vibrant, neon-lit alleyway bustling with energy. Colorful signs in various languages illuminate the wet pavement, reflecting their glow. Tables and stools are scattered across the scene, while a figure leans against a wall, creating a lively urban atmosphere."  class="wp-image-277275" ></a></figure>



<h3 id="one-click-looks-with-a-shelf-of-profiles" class="wp-block-heading">One click looks, with a shelf of profiles</h3>



<p class="wp-block-paragraph">LensCore applies a complete lens look with one click, covering bokeh, defocus, distortion, vignetting, and other optical effects tied to a specific physical lens. A digital lens shelf lets artists load lens profiles for real cinema lenses or custom presets and quickly compare looks, aiming to enable repeatable workflows across sequences and teams.</p>



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/zeiss-cincraft-lenscore_lens-selection.jpg?quality=80&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1200"  height="1672"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/zeiss-cincraft-lenscore_lens-selection.jpg?resize=1200%2C1672&quality=80&ssl=1"  alt="A dark user interface displayed on a computer screen features a drop-down menu showing various camera lens options. The selected lens is the &#039;ZEISS Supreme Prime 50.&#039; Other settings for aperture, focus, vignetting, distortion, and chromatic aberration are also visible on the screen."  class="wp-image-277278" ></a></figure>



<p class="wp-block-paragraph">Zeiss also lists lens characteristics it aims to capture, including sharpness, focus fall-off, cat-eye bokeh, chromatic aberration, distortion, and dirt filters. That part matters for day-to-day <a href="https://digitalproduction.com/tag/vfx/?utm_source=chatgpt.com">VFX</a> work because the lens look rarely comes from a single knob; it comes from how those traits interact when you animate focus and exposure. About a third of the way into your show, the difference between a reusable lens profile and a fragile hero setup usually shows up in the vendor handoffs.</p>
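A small worked example of why those traits interact. In the standard thin-lens model (textbook optics, not LensCore's code), the defocus blur size on the sensor depends jointly on aperture, focal length, and focus distance, so animating focus or stop changes the bokeh too:

```python
def circle_of_confusion_mm(f_mm, n_stop, focus_mm, subject_mm):
    """Blur spot diameter on the sensor for an out-of-focus subject
    (thin-lens approximation)."""
    aperture = f_mm / n_stop  # entrance pupil diameter
    return (aperture * abs(subject_mm - focus_mm) / subject_mm
            * f_mm / (focus_mm - f_mm))

# 50 mm lens at T2, focused at 2 m, background at 4 m:
blur = circle_of_confusion_mm(50.0, 2.0, 2000.0, 4000.0)
print(f"{blur:.2f} mm")  # 0.32 mm: a visibly soft background

# Stopping down to T5.6 shrinks the same blur proportionally:
print(f"{circle_of_confusion_mm(50.0, 5.6, 2000.0, 4000.0):.2f} mm")
```

A real lens adds the traits on top of this ideal model: the shape of that blur (cat-eye near the edges), its colour fringing, and its fall-off are what a per-lens profile is supposed to carry.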



<h3 id="focus-changes-that-include-breathing-and-cat-eye" class="wp-block-heading">Focus changes that include breathing and cat eye</h3>



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="nerl08UrXs2jDmsiwtDLYNqUx8XKF4kTYzbBo7houy16ACjQIREZPTQMCVbGxRO6cS95vZ"><blockquote class="wp-embedded-content" data-secret="03QbXbigni"><a href="https://digitalproduction.com/2025/05/06/virtual-glass-zeiss-enters-the-simulated-optics-game/">Virtual Glass: ZEISS Enters the Simulated Optics Game</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“Virtual Glass: ZEISS Enters the Simulated Optics Game” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/05/06/virtual-glass-zeiss-enters-the-simulated-optics-game/embed/#?secret=NCG8kOSFWk#?secret=03QbXbigni" data-secret="03QbXbigni" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph">The LensCore digital lenses are provided per focal length, each digital lens representing one real lens. ZEISS also claims that the look changes with focus, including breathing and bokeh behaviour, and that cat-eye bokeh applies in one step based on the selected lens. We are not well-versed enough in ZEISS glass to verify that from a screen… Get the demo when it drops, and check for yourself.</p>
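For readers unfamiliar with breathing, here is the mechanism in the ideal thin-lens model: refocusing closer pushes the image plane back, which narrows the field of view. Real lens designs breathe differently per lens, which is exactly what a per-lens profile would capture; this sketch only shows the effect exists even for perfect glass.

```python
import math

def fov_deg(f_mm, focus_mm, sensor_mm=36.0):
    """Horizontal field of view of an ideal thin lens focused at focus_mm."""
    v = f_mm * focus_mm / (focus_mm - f_mm)  # thin-lens image distance
    return 2 * math.degrees(math.atan(sensor_mm / 2 / v))

print(f"{fov_deg(50.0, 1e9):.1f} deg at infinity")  # 39.6 deg at infinity
print(f"{fov_deg(50.0, 1000.0):.1f} deg at 1 m")    # 37.8 deg at 1 m
```

A focus pull from infinity to one metre loses nearly two degrees of view even in this idealised case, which is why a simulated rack focus that ignores breathing reads as fake.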



<p class="wp-block-paragraph">The tool is designed to work with both ZEISS and non-ZEISS lenses, so users can mimic practically any lens look. Artists can still adjust lens characteristics such as chromatic aberration after applying a lens, with the goal of keeping the experience technically credible and comparable to physically correct lens behaviour.</p>






<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/zeiss-cincraft-lenscore_lens-customization.jpg?quality=80&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1099"  height="2000"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/zeiss-cincraft-lenscore_lens-customization.jpg?resize=1099%2C2000&quality=80&ssl=1"  alt="A screenshot of a digital photo editing application interface showcases various controls for adjusting lens effects. The layout features labeled sliders for bokeh preview, aperture settings, focus adjustments, and other image enhancement options, each organized in clearly defined sections."  class="wp-image-277280" ></a></figure>



<h3 id="depth-helpers-inpaint-and-performance-notes" class="wp-block-heading">Depth helpers, inpaint, and performance notes</h3>



<p class="wp-block-paragraph">LensCore includes an inpaint feature that fills occluded areas behind defocused objects to reduce complex 3D setups and speed up compositing workflows. On performance, the GPU-based rendering should mean that high-end renderings stay within the timeframes artists already expect. There are also fidelity options that can be adapted to your needs, including but not limited to serial-number-based approaches.</p>



<h3 id="where-it-sits-in-a-wider-lens-data-stack" class="wp-block-heading">Where it sits in a wider lens data stack</h3>



<figure class="wp-block-embed alignright is-type-wp-embed is-provider-digital-production wp-block-embed-digital-production"><div class="wp-block-embed__wrapper">
<span class="5YjW6Top7dVDNcyUmt1krvHgezXIqPOSBRaJGQE2bK"><blockquote class="wp-embedded-content" data-secret="IVGwAVI1Md"><a href="https://digitalproduction.com/2025/10/17/zeiss-cincraft-virtual-lens-enters-beta-real-glass-virtual-magic/">ZEISS CinCraft Virtual Lens Enters BETA: Real Glass, Virtual Magic</a></blockquote><iframe class="wp-embedded-content" sandbox="allow-scripts" security="restricted"  title="“ZEISS CinCraft Virtual Lens Enters BETA: Real Glass, Virtual Magic” — DIGITAL PRODUCTION" src="https://digitalproduction.com/2025/10/17/zeiss-cincraft-virtual-lens-enters-beta-real-glass-virtual-magic/embed/#?secret=Q7XA99mv0l#?secret=IVGwAVI1Md" data-secret="IVGwAVI1Md" width="600" height="338" frameborder="0" marginwidth="0" marginheight="0" scrolling="no"></iframe></span>
</div></figure>



<p class="wp-block-paragraph">LensCore sits next to other CinCraft components used earlier in the pipeline. <a href="https://www.zeiss.com/photonics-and-optics/en/cinematography/cincraft/scenario.html">CinCraft Scenario</a> is an indoor-outdoor camera tracking solution for real-time applications, plus lens and point cloud data recording and export for post. <a href="https://www.zeiss.com/photonics-and-optics/en/cinematography/cincraft/mapper.html">CinCraft Mapper</a> is a service providing frame-accurate lens distortion and shading data for post, including GUI and command line use on Linux, Windows, and macOS. The same files work with <a href="https://www.zeiss.com/photonics-and-optics/en/cinematography/know-how-hub/extended-data.html">eXtended Data</a>, a lens-embedded technology introduced in 2017 and part of the foundation for the CinCraft ecosystem.</p>



<h3 id="availability-demo-and-licensing" class="wp-block-heading">Availability, demo, and licensing</h3>



<p class="wp-block-paragraph">LensCore was demonstrated at FMX 2026 in Stuttgart at booth 2.1 in the Marketplace. Worldwide availability is set for June 1, 2026 through the CinCraft webshop, with multiple license types on offer; a demo version arrives the same day. Pricing is not yet specified; you&#8217;ll have to wait three more weeks for that. As always, test new tools before use in production, especially on shots with heavy grain, fine detail, fast motion, and tight edge work.</p>



<p class="wp-block-paragraph">Digital Production will publish an interview soon &#8211; if you have any specific questions you&#8217;d want to ask (about LensCore and ZEISS, not in general &#8211; I am NOT falling for that AGAIN), drop me a line through the contact page!</p>



<p class="wp-block-paragraph"><br /><a href="https://www.zeiss.com/photonics-and-optics/en/home/content/newsroom/news-overview/2026/cincraft-lenscore.html" title="">https://www.zeiss.com/photonics-and-optics/en/home/content/newsroom/news-overview/2026/cincraft-lenscore.html</a><br /><br /><a href="https://www.zeiss.com/photonics-and-optics/us/cinematography/cincraft/virtual-lens-technology.html" title="">https://www.zeiss.com/photonics-and-optics/us/cinematography/cincraft/virtual-lens-technology.html</a></p><p>The post <a href="https://digitalproduction.com/2026/05/12/zeiss-cincraft-lenscore-targets-nuke-lens-looks/">ZEISS CinCraft LensCore targets Nuke lens looks</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></content:encoded>
					
		
		
		<enclosure url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/1635fee8-66d0-413a-a72c-24ea227ecd85.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" length="86771" type="image/jpg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/1635fee8-66d0-413a-a72c-24ea227ecd85.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" medium="image" type="image/jpeg">
	<media:copyright>DIGITAL PRODUCTION</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[A dimly lit alleyway is filled with vibrant neon signs in Japanese, casting colorful reflections on wet surfaces. Empty tables and chairs hint at recent activity, while the blurred background suggests a lively atmosphere of food stalls awaiting patrons.]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/1635fee8-66d0-413a-a72c-24ea227ecd85.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" />
<post-id xmlns="com-wordpress:feed-additions:1">276310</post-id>	</item>
		<item>
		<title>Blender 5.2 LTS targets texture-heavy renders</title>
		<link>https://digitalproduction.com/2026/05/12/blender-5-2-lts-targets-texture-heavy-renders/</link>
		
		<dc:creator><![CDATA[Bela Beier]]></dc:creator>
		<pubDate>Tue, 12 May 2026 06:00:00 +0000</pubDate>
				<category><![CDATA[News]]></category>
		<category><![CDATA[3d-texture]]></category>
		<category><![CDATA[Blender]]></category>
		<category><![CDATA[Blender Cycles]]></category>
		<category><![CDATA[blender-cycles]]></category>
		<category><![CDATA[blender-workflow]]></category>
		<category><![CDATA[cloud-rendering]]></category>
		<category><![CDATA[render-manager]]></category>
		<category><![CDATA[texture-painting]]></category>
		<category><![CDATA[vfx-pipelines]]></category>
		<guid isPermaLink="false">https://digitalproduction.com/?p=276327</guid>

					<description><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/mip_level_visualization-1536x864-1.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A surreal urban scene bursting with vibrant colors, featuring bright green and blue checkered buildings. A whimsical store with a red striped awning stands on a checkered street, alongside a bright red scooter. The kaleidoscopic environment creates a dreamlike effect." /></div><div><p>Blender 5.2 LTS adds a Cycles texture cache that builds .tx tiles and loads only what the render needs. Less VRAM waste, more scene headroom.</p>
<p>The post <a href="https://digitalproduction.com/2026/05/12/blender-5-2-lts-targets-texture-heavy-renders/">Blender 5.2 LTS targets texture-heavy renders</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></description>
					<content:encoded><![CDATA[<div style="margin: 5px 5% 10px 5%;"><img src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/mip_level_visualization-1536x864-1.jpg?fit=1200%2C675&quality=80&ssl=1" width="1200" height="675" title="" alt="A surreal urban scene bursting with vibrant colors, featuring bright green and blue checkered buildings. A whimsical store with a red striped awning stands on a checkered street, alongside a bright red scooter. The kaleidoscopic environment creates a dreamlike effect." /></div><div><p class="wp-block-paragraph"><em>For those who don’t know the topic: <a href="https://www.blender.org">Blender</a> is a DCC with the <a href="https://www.blender.org/features/rendering/">Cycles</a> renderer. The new texture cache lives in Cycles&#8217; render performance settings, alongside the existing memory and render-speed controls.</em></p>


<h3 id="the-problem-it-aims-to-fix" class="wp-block-heading">The problem it aims to fix</h3>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/code.blender.org/wp-content/uploads/2026/04/image.png?w=1200&quality=72&ssl=1"  alt="https://code.blender.org/wp-content/uploads/2026/04/image.png"  style="width:281px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">Texture-heavy scenes can burn memory fast, even before you hit your sample count. The new texture cache in <a href="https://digitalproduction.com/tag/cycles/" title="Cycles">Cycles </a>targets that pressure by changing what gets loaded at render time, and how. The core idea: instead of pulling full image textures into memory, the system loads only the tiles and resolutions actually needed for the render. That means the renderer can avoid carrying the entire source image when the shot only uses a fraction of it, or when the texture resolves to a smaller mip level in-frame.</p>
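To make the memory argument concrete, here is a small back-of-the-envelope sketch in Python (illustrative arithmetic only, not Blender API code; the texture sizes and mip level are made up for the example):

```python
# Illustration: how much memory a texture needs when the render only
# resolves it to a smaller mip level, versus loading the full image.

def mip_bytes(width: int, height: int, channels: int = 4,
              bytes_per_channel: int = 1, mip: int = 0) -> int:
    """Bytes needed for one mip level of a texture (hypothetical helper)."""
    w = max(1, width >> mip)   # each mip level halves each dimension
    h = max(1, height >> mip)
    return w * h * channels * bytes_per_channel

full = mip_bytes(8192, 8192)           # full-resolution 8K RGBA, 8-bit
needed = mip_bytes(8192, 8192, mip=3)  # shot only resolves to 1024x1024

print(f"full: {full / 2**20:.0f} MiB, needed: {needed / 2**20:.0f} MiB")
# full: 256 MiB, needed: 4 MiB
```

The same logic applies per tile: a shot framing a fraction of a texture touches only the tiles inside that footprint, so the savings compound.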



<p class="wp-block-paragraph">This is a straight-up memory efficiency play for scenes that rely on lots of image textures. Claims about the size of the improvement exist, but hard benchmark numbers are difficult to give because the savings depend wholly on your scene &#8211; so test before using it in production!</p>



<figure class="wp-block-image"><img data-recalc-dims="1" height="766" width="1200"  decoding="async"  src="https://i0.wp.com/code.blender.org/wp-content/uploads/2026/04/cycles_texture_cache_graph-1536x981.png?resize=1200%2C766&quality=72&ssl=1"  alt="https://code.blender.org/wp-content/uploads/2026/04/cycles_texture_cache_graph-1536x981.png" ></figure>



<h3 id="tx-files-generated-next-to-your-textures" class="wp-block-heading">.tx files, generated next to your textures</h3>



<p class="wp-block-paragraph">Cycles Texture Cache generates a matching .tx file for each image texture and places it in a blender_tx folder next to the source image. The .tx file stores the texture in a form optimized for tiled access and multi-resolution loading, so the renderer can pull only the tiles and mip levels it needs.</p>


<div class="wp-block-code">
	<div class="cm-editor">
		<div class="cm-scroller">
			
<pre><code><div class="cm-line"># Generate tx files for all images in a blend file.</div><div class="cm-line">blender scene.blend --command maketx</div><div class="cm-line"></div><div class="cm-line"># Generate tx file for a specific image file.</div><div class="cm-line">blender --command maketx image.png --colorspace sRGB</div></code></pre>
		</div>
	</div>
</div>
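Based on the folder layout described above (a blender_tx folder next to the source image), the cache location for a given texture is easy to predict; a minimal sketch, assuming the .tx file simply reuses the source filename with a .tx extension (the exact naming scheme is an assumption for illustration):

```python
from pathlib import Path

def expected_tx_path(image_path: str) -> Path:
    """Sketch: where a cached .tx file would live, per the described
    layout (a 'blender_tx' folder next to the source image).
    The filename scheme itself is an assumption, not documented API."""
    src = Path(image_path)
    return src.parent / "blender_tx" / (src.stem + ".tx")

print(expected_tx_path("/shots/sq010/tex/wall_diffuse.png"))
# → /shots/sq010/tex/blender_tx/wall_diffuse.tx (on a POSIX path layout)
```

Knowing this layout up front matters if your textures live on read-only or versioned storage, where a renderer writing siblings next to source files may need special handling.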


<figure class="wp-block-image"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/code.blender.org/wp-content/uploads/2026/05/image.png?w=1200&quality=72&ssl=1"  alt="https://code.blender.org/wp-content/uploads/2026/05/image.png" ></figure>



<p class="wp-block-paragraph">The workflow aims to stay mostly invisible. The .tx files regenerate automatically when the source image changes. Cycles also infers settings such as color space and filtering based on how the image gets used in shader nodes, rather than forcing artists to maintain a parallel set of cache settings by hand. That last part matters for production sanity: texture caching that requires constant babysitting tends to get switched off right before deadline.</p>
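The &#8220;regenerate automatically when the source changes&#8221; behaviour boils down to a freshness check; a minimal sketch of that rule (a hypothetical helper for illustration, not Blender&#8217;s actual implementation):

```python
from typing import Optional

def tx_is_stale(src_mtime: float, tx_mtime: Optional[float]) -> bool:
    """Freshness rule sketched from the description above: the cache
    file is rebuilt when it is missing (no mtime) or older than its
    source image."""
    return tx_mtime is None or src_mtime > tx_mtime

# A .tx that predates its source image needs regenerating:
print(tx_is_stale(src_mtime=200.0, tx_mtime=100.0))  # True
```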



<h3 id="cpu-gpu-and-the-day-to-day-reality-check" class="wp-block-heading">CPU, GPU, and the day-to-day reality check</h3>


<div class="wp-block-image">
<figure class="alignleft is-resized"><img data-recalc-dims="1"  decoding="async"  src="https://i0.wp.com/code.blender.org/wp-content/uploads/2026/05/image-1.png?w=1200&quality=72&ssl=1"  alt="https://code.blender.org/wp-content/uploads/2026/05/image-1.png"  style="width:393px;height:auto" ></figure>
</div>


<p class="wp-block-paragraph">The feature works across CPU rendering and all GPU backends. It is already available in daily builds, with further improvements planned for render farms and production-pipeline use.</p>



<p class="wp-block-paragraph">If you manage shared storage, you will care about where those .tx files live, who writes them, and when. The currently accessible details do not fully spell out a studio-grade policy story for cache location, locking, or multi-user coordination. Still, the direction is clear: get memory use under control without turning texture management into another job title.</p>



<figure class="wp-block-image size-full"><a href="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/image-10.png?quality=72&ssl=1"><img data-recalc-dims="1"  decoding="async"  width="1177"  height="445"  sizes="(max-width: 1200px) 100vw, 1200px"  src="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/image-10.png?resize=1177%2C445&quality=72&ssl=1"  alt="Four application icons for the Liquid Glass app on macOS, showcasing different styles: Default with a vibrant gradient background of pink and orange, Dark with a deep background, and two monochrome variants in gray tones, each featuring a distinctive logo design."  class="wp-image-277322" ></a></figure>



<h3 id="schedule-availability-and-pricing" class="wp-block-heading">Schedule, availability, and pricing</h3>



<p class="wp-block-paragraph">Blender 5.2 LTS is in alpha, with a full release expected in July. Pricing is the same as always for Blender – free and Open Source. Before you roll this into a show, test it on your own shots, your own storage, and your own farm tooling, because new caching systems love to surface edge cases at the worst possible time.<br /><br /><a href="https://code.blender.org/2026/05/cycles-texture-cache/" title="">https://code.blender.org/2026/05/cycles-texture-cache/</a></p>



<p class="wp-block-paragraph"><a href="https://developer.blender.org/docs/release_notes/5.2/?utm_source=chatgpt.com">https://developer.blender.org/docs/release_notes/5.2/</a></p><p>The post <a href="https://digitalproduction.com/2026/05/12/blender-5-2-lts-targets-texture-heavy-renders/">Blender 5.2 LTS targets texture-heavy renders</a> first appeared on <a href="https://digitalproduction.com">DIGITAL PRODUCTION</a> and was written by <a href="https://digitalproduction.com/author/qualityjellyfish45275761d0/">Bela Beier</a>. </p></div>]]></content:encoded>
					
		
		
		<enclosure url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/mip_level_visualization-1536x864-1.jpg?fit=1536%2C864&#038;quality=80&#038;ssl=1" length="270907" type="image/jpg" />
<media:content xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/mip_level_visualization-1536x864-1.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" medium="image" type="image/jpeg">
	<media:copyright>DIGITAL PRODUCTION</media:copyright>
	<media:title></media:title>
	<media:description type="html"><![CDATA[A surreal urban scene bursting with vibrant colors, featuring bright green and blue checkered buildings. A whimsical store with a red striped awning stands on a checkered street, alongside a bright red scooter. The kaleidoscopic environment creates a dreamlike effect.]]></media:description>
</media:content>
<media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://i0.wp.com/digitalproduction.com/wp-content/uploads/2026/05/mip_level_visualization-1536x864-1.jpg?fit=1200%2C675&#038;quality=80&#038;ssl=1" width="1200" height="675" />
<post-id xmlns="com-wordpress:feed-additions:1">276327</post-id>	</item>
	</channel>
</rss>