<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Subsurface_scattering</id>
	<title>Subsurface scattering - Revision history</title>
	<link rel="self" type="application/atom+xml" href="http://debianws.lexgopc.com/wiki143/index.php?action=history&amp;feed=atom&amp;title=Subsurface_scattering"/>
	<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Subsurface_scattering&amp;action=history"/>
	<updated>2026-05-14T03:57:37Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.43.1</generator>
	<entry>
		<id>http://debianws.lexgopc.com/wiki143/index.php?title=Subsurface_scattering&amp;diff=1244418&amp;oldid=prev</id>
		<title>imported&gt;LucasBrown: Importing Wikidata short description: &quot;Mechanism of light transport&quot;</title>
		<link rel="alternate" type="text/html" href="http://debianws.lexgopc.com/wiki143/index.php?title=Subsurface_scattering&amp;diff=1244418&amp;oldid=prev"/>
		<updated>2024-05-18T12:20:58Z</updated>

		<summary type="html">&lt;p&gt;Importing Wikidata &lt;a href=&quot;https://en.wikipedia.org/wiki/Short_description&quot; class=&quot;extiw&quot; title=&quot;wikipedia:Short description&quot;&gt;short description&lt;/a&gt;: &amp;quot;Mechanism of light transport&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;{{Short description|Mechanism of light transport}}&lt;br /&gt;
{{refimprove|date=October 2017}}&lt;br /&gt;
[[File:Skin Subsurface Scattering.jpg|thumb|Real-world subsurface scattering of light in a photograph of a human hand]]&lt;br /&gt;
[[File:Subsurface scattering.png|thumb|Computer-generated subsurface scattering in [[Blender (software)|Blender]]]]&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;Subsurface scattering&amp;#039;&amp;#039;&amp;#039; (&amp;#039;&amp;#039;&amp;#039;SSS&amp;#039;&amp;#039;&amp;#039;), also known as &amp;#039;&amp;#039;&amp;#039;subsurface light transport&amp;#039;&amp;#039;&amp;#039; (&amp;#039;&amp;#039;&amp;#039;SSLT&amp;#039;&amp;#039;&amp;#039;),&amp;lt;ref&amp;gt;{{cite web |title=Finish: Subsurface Light Transport |url=http://wiki.povray.org/content/Reference:Finish#Subsurface_Light_Transport |work=[[POV-Ray|POV-Ray wiki]] |date=August 8, 2012}}&amp;lt;/ref&amp;gt; is a mechanism of [[light]] transport in which light that penetrates the surface of a [[translucent]] object is [[scattering|scattered]] by interacting with the [[Material (computer graphics)|material]] and exits the surface potentially at a different point. Light generally penetrates the surface and gets scattered a number of times at irregular angles inside the material before passing back out of the material [[Diffuse reflection|at a different angle]] than it would have had if it had been [[Specular reflection|reflected &amp;#039;&amp;#039;directly&amp;#039;&amp;#039;]] off the surface.&lt;br /&gt;
&lt;br /&gt;
Subsurface scattering is important for realistic [[3D computer graphics]], being necessary for the rendering of materials such as [[marble]], [[skin]], [[leaf|leaves]], [[wax]] and [[milk]]. If subsurface scattering is not implemented, the material may look unnatural, like plastic or metal.&lt;br /&gt;
&lt;br /&gt;
==Rendering techniques==&lt;br /&gt;
[[File:ShellOpticalDescattering.png|thumb|upright=1.3|Direct surface scattering (left) plus subsurface scattering (middle) creates the final image on the right.]]&lt;br /&gt;
To improve rendering efficiency, many [[real-time computer graphics]] algorithms compute reflectance only at the &amp;#039;&amp;#039;surface&amp;#039;&amp;#039; of an object. In reality, many materials are slightly translucent: light enters the surface and is absorbed, scattered and re-emitted{{snd}} potentially at a different point. Skin is a good case in point: only about 6% of reflectance is direct; 94% comes from subsurface scattering.&amp;lt;ref name=&amp;quot;Krishnaswamy&amp;quot;&amp;gt;{{cite journal|last1=Krishnaswamy|first1=A|last2=Baronoski|first2=GVG|title=A Biophysically-based Spectral Model of Light Interaction with Human Skin|journal=Computer Graphics Forum|publisher=Blackwell Publishing|volume=23|issue=3|year=2004|url=http://eg04.inrialpes.fr/Programme/Papers/PDF/paper1189.pdf|doi=10.1111/j.1467-8659.2004.00764.x|pages=331|s2cid=5746906}}&amp;lt;/ref&amp;gt; An inherent property of semitransparent materials is absorption: the further light travels through the material, the greater the proportion absorbed. To simulate this effect, a measure of the distance the light has traveled through the material must be obtained.&lt;br /&gt;
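The distance-dependent absorption described above follows the Beer-Lambert law: transmittance decays exponentially with the distance traveled. A minimal sketch in Python (the absorption coefficient `sigma_a` and the sample distances are illustrative values, not from the article):

```python
import math

def transmittance(distance, sigma_a):
    """Beer-Lambert law: fraction of light surviving after traveling
    `distance` through a medium with absorption coefficient `sigma_a`
    (per unit length)."""
    return math.exp(-sigma_a * distance)

# The further light travels through the material, the less survives:
shallow = transmittance(0.1, 2.0)  # short path through the material
deep = transmittance(1.0, 2.0)     # long path through the material
```

Here `shallow > deep`, matching the statement that a greater proportion of the light is absorbed over longer paths.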
&lt;br /&gt;
===Random walk SSS===&lt;br /&gt;
[[File:SSS-DragonOrangeLight-00.png|thumb|Random walk SSS in Equinox3D&amp;#039;s path-tracer.]]&lt;br /&gt;
[[File:SSS-DragonBluePlastic-00.png|thumb|Random walk SSS + PBR surface reflection in Equinox3D&amp;#039;s path-tracer.]]&lt;br /&gt;
Published by Pixar and usually integrated into a path-tracer, this technique is considered the state of the art. It essentially simulates what happens to real photons: a light path is traced into the material, new paths are generated using a Lambertian distribution around the inverted normal, and new directions are then picked at multiple steps to scatter the path further{{snd}} hence the name &amp;quot;random walk&amp;quot;. Isotropic scattering is simulated by picking random directions uniformly over a sphere; anisotropic scattering, exhibited for example by human skin, is usually simulated with the Henyey-Greenstein phase function. Optical depth / absorption is applied based on the length of the paths, using the Beer-Lambert law. A path may be terminated inside the material when its contribution falls below a minimum threshold or it reaches a maximum iteration count. When a path (ray) hits the surface again, it is used for gathering radiance from the scene, weighted by a Lambertian distribution, as in a traditional path-tracer. This technique is intuitive and robust even with thin geometry.&lt;br /&gt;
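The two direction-sampling operations named above can be sketched in isolation. This is a minimal illustration, not Pixar's implementation: uniform sampling on the sphere for isotropic scattering, and the standard inversion-method sample of the Henyey-Greenstein phase function for anisotropic scattering (anisotropy parameter `g`, where `g = 0` reduces to isotropic):

```python
import math
import random

def sample_isotropic(rng=random):
    """Uniform random direction on the unit sphere (isotropic scattering)."""
    z = 1.0 - 2.0 * rng.random()
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def sample_hg_cos_theta(g, rng=random):
    """Cosine of the scattering angle drawn from the Henyey-Greenstein
    phase function with anisotropy g in (-1, 1); the mean sampled
    cosine equals g, so g > 0 gives forward scattering."""
    xi = rng.random()
    if abs(g) < 1e-6:
        return 1.0 - 2.0 * xi  # isotropic limit
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - s * s) / (2.0 * g)
```

In a full random-walk integrator these samples would be chained step by step, with Beer-Lambert attenuation applied over each segment length and termination once the throughput drops below a threshold.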
&lt;br /&gt;
===Depth Map based SSS===&lt;br /&gt;
[[Image:Sub-surface scattering depth map.svg|thumb|Depth estimation using depth maps]]&lt;br /&gt;
One method of estimating this distance is to use depth maps,&amp;lt;ref name=&amp;quot;Green&amp;quot;&amp;gt;{{cite journal|last=Green|first=Simon|title=Real-time Approximations to Subsurface Scattering|journal=GPU Gems|publisher=Addison-Wesley Professional|year=2004|pages=263–278}}&amp;lt;/ref&amp;gt; in a manner similar to [[shadow mapping]]. The scene is rendered from the light&amp;#039;s point of view into a depth map, so that the distance to the nearest surface is stored. The [[depth map]] is then projected onto the scene using standard [[projective texture mapping]] and the scene re-rendered. In this pass, when shading a given point, the distance from the light at the point the ray entered the surface can be obtained by a simple texture lookup. By subtracting this value from the distance at the point the ray exited the object, we obtain an estimate of the distance the light has traveled through the object.{{fact|date=October 2017}}&lt;br /&gt;
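The subtraction at the heart of this pass is simple enough to sketch. Assuming the depth map stores, per projected texture coordinate, the light-to-entry-point distance, the estimate is just exit depth minus entry depth (the dictionary-as-texture and the parameter names are illustrative):

```python
def traveled_distance(depth_map, uv, shaded_point_depth):
    """Estimate the distance light traveled inside the object.
    `depth_map` was rendered from the light and stores the distance to
    the nearest surface (the entry point) at each projected coordinate;
    `shaded_point_depth` is the light-space depth of the point being
    shaded (the exit point)."""
    entry_depth = depth_map[uv]              # texture lookup: light -> entry
    return shaded_point_depth - entry_depth  # exit depth - entry depth
```

On a GPU the lookup would be a projective texture fetch in the shader; the dictionary here only stands in for that sampler.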
&lt;br /&gt;
The measure of distance obtained by this method can be used in several ways. One such way is to use it to index directly into an artist-created 1D texture that falls off exponentially with distance. This approach, combined with other more traditional lighting models, allows the creation of different materials such as [[marble]], [[jade]] and [[wax]].{{fact|date=October 2017}}&lt;br /&gt;
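The 1D falloff texture and the distance-indexed lookup can be sketched as follows; the `decay` parameter stands in for the artist-tuned material falloff and is not a value from the article:

```python
import math

def build_falloff_texture(size, decay):
    """Precompute a 1D lookup texture whose value falls off
    exponentially with normalized distance."""
    return [math.exp(-decay * i / (size - 1)) for i in range(size)]

def sample_falloff(texture, distance, max_distance):
    """Index the 1D texture by the light's traveled distance,
    clamped to [0, max_distance]."""
    t = min(max(distance / max_distance, 0.0), 1.0)
    return texture[int(t * (len(texture) - 1))]
```

In practice the texture would live on the GPU and the lookup would be a single `tex1D` fetch in the shader; different curves (or several textures) yield the marble, jade and wax looks mentioned above.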
&lt;br /&gt;
Potentially, problems can arise if models are not convex, but [[depth peeling]]&amp;lt;ref name=&amp;quot;Nagy&amp;quot;&amp;gt;{{cite conference|last1=Nagy|first1=Z|last2=Klein|first2=R|title=Depth-Peeling for Texture-based Volume Rendering|conference=11th Pacific Conference on Computer Graphics and Applications|year=2003|pages=429–433|doi=10.1109/PCCGA.2003.1238289|isbn=0-7695-2028-6|url=http://cg.cs.uni-bonn.de/docs/publications/2003/nagy-2003-depth.pdf}}&amp;lt;/ref&amp;gt; can be used to avoid the issue. Similarly, depth peeling can be used to account for varying densities beneath the surface, such as bone or muscle, to give a more accurate scattering model.&lt;br /&gt;
&lt;br /&gt;
As can be seen in the image of the wax head to the right, light isn&amp;#039;t diffused when passing through the object using this technique; back features are clearly visible. One solution to this is to take multiple samples at different points on the surface of the depth map. Alternatively, a different approach to approximation can be used, known as [[texture-space]] diffusion.{{fact|date=October 2017}}&lt;br /&gt;
&lt;br /&gt;
===Texture space diffusion===&lt;br /&gt;
As noted at the start of the section, one of the more obvious effects of subsurface scattering is a general blurring of the diffuse lighting. Rather than arbitrarily modifying the diffuse function, diffusion can be more accurately modeled by simulating it in [[texture space]]. This technique was pioneered in rendering faces in &amp;#039;&amp;#039;[[The Matrix Reloaded]]&amp;#039;&amp;#039;,&amp;lt;ref name=&amp;quot;Borshukov&amp;quot;&amp;gt;{{cite journal|last1=Borshukov|first1=G|last2=Lewis|first2=J. P.|title=Realistic human face rendering for &amp;quot;The Matrix Reloaded&amp;quot;|publisher=ACM Press|journal=Computer Graphics|year=2005|url=http://www.scribblethink.org/Work/Pdfs/Face-s2003.pdf}}&amp;lt;/ref&amp;gt; but is also used in the realm of real-time rendering techniques.&lt;br /&gt;
&lt;br /&gt;
The method unwraps the mesh of an object using a vertex shader, first calculating the lighting based on the original vertex coordinates. The vertices are then remapped using the UV [[Texture mapping|texture coordinates]] as the screen position of the vertex, suitably transformed from the [0, 1] range of texture coordinates to the [-1, 1] range of normalized device coordinates. By lighting the unwrapped mesh in this manner, we obtain a 2D image representing the lighting on the object, which can then be processed and reapplied to the model as a [[light map]]. To simulate diffusion, the light map texture can simply be blurred. Rendering the lighting to a lower-resolution texture in itself provides a certain amount of blurring. The amount of blurring required to accurately model subsurface scattering in skin is still under active research, but performing only a single blur poorly models the true effects.&amp;lt;ref name=&amp;quot;d’Eon&amp;quot;&amp;gt;{{cite journal|last=d’Eon|first=E|title=Advanced Skin Rendering|journal=GDC 2007|year=2007|url=http://developer.download.nvidia.com/presentations/2007/gdc/Advanced_Skin.pdf}}&amp;lt;/ref&amp;gt; To emulate the wavelength-dependent nature of diffusion, the samples used during the (Gaussian) blur can be weighted by channel. This is somewhat of an artistic process. For human skin, the broadest scattering is in red, then green, and blue has very little scattering.{{fact|date=October 2017}}&lt;br /&gt;
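The per-channel weighting can be sketched as one separable blur pass over a light-map row, with a wider Gaussian for red than for green and blue. The sigma values below are illustrative stand-ins for the artist-tuned weights, not measured skin parameters:

```python
import math

def gaussian_kernel(radius, sigma):
    """Normalized 1D Gaussian weights over offsets [-radius, radius]."""
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    total = sum(k)
    return [w / total for w in k]

def blur_row_rgb(row, sigmas=(4.0, 2.0, 1.0), radius=8):
    """Per-channel Gaussian blur of one light-map row of (r, g, b)
    tuples: red uses the widest kernel and blue the narrowest,
    mimicking the wavelength-dependent diffusion in skin."""
    kernels = [gaussian_kernel(radius, s) for s in sigmas]
    n = len(row)
    out = []
    for x in range(n):
        pixel = []
        for c in range(3):
            acc = 0.0
            for i, w in enumerate(kernels[c]):
                xi = min(max(x + i - radius, 0), n - 1)  # clamp at edges
                acc += w * row[xi][c]
            pixel.append(acc)
        out.append(tuple(pixel))
    return out
```

A full implementation would run this horizontally and vertically over the whole light-map texture (typically as a fragment shader); a bright texel then bleeds noticeably red into its neighbours, which is the visual signature of skin scattering described above.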
&lt;br /&gt;
A major benefit of this method is its independence of screen resolution; shading is performed only once per texel in the texture map, rather than for every pixel on the object. An obvious requirement is thus that the object have a good UV mapping, in that each point on the texture must map to only one point of the object. Additionally, the use of texture space diffusion provides one of the several factors that contribute to soft shadows, alleviating one cause of the realism deficiency of [[shadow mapping]].{{fact|date=October 2017}}&lt;br /&gt;
&lt;br /&gt;
==See also==&lt;br /&gt;
* [[Bidirectional scattering distribution function]]&lt;br /&gt;
&lt;br /&gt;
==References==&lt;br /&gt;
{{Reflist|2}}&lt;br /&gt;
&lt;br /&gt;
==External links==&lt;br /&gt;
* [http://graphics.ucsd.edu/~henrik/images/subsurf.html Henrik Wann Jensen&amp;#039;s subsurface scattering website]&lt;br /&gt;
* [http://graphics.ucsd.edu/~henrik/papers/bssrdf/ An academic paper by Jensen on modeling subsurface scattering]&lt;br /&gt;
* [https://www.creativecrash.com/maya/tutorials/rendering-lighting/shaders/c/subsurface-scattering-using-the-misss-fast-simple-maya-shader Subsurface Scattering: Using the Misss_Fast_Simple_Maya shader] – Maya tutorial&lt;br /&gt;
* [http://www.mrbluesummers.com/3510/3d-tutorials/3dsmax-mental-ray-sub-surface-scattering-guide/ 3d Studio Max Tutorial - The definitive guide to using subsurface scattering in 3dsMax]&lt;br /&gt;
* [[b:Blender_3D:_Noob_to_Pro/Subsurface_scattering|Subsurface scattering]] in [[Blender (software)|Blender]].&lt;br /&gt;
&lt;br /&gt;
{{DEFAULTSORT:Subsurface Scattering}}&lt;br /&gt;
[[Category:3D rendering]]&lt;br /&gt;
[[Category:Scattering, absorption and radiative transfer (optics)]]&lt;/div&gt;</summary>
		<author><name>imported&gt;LucasBrown</name></author>
	</entry>
</feed>