Each material needs a shader assigned that defines how it is rendered. By default, all materials use the Standard shader. DeepAR Studio comes with the following shaders:

  • Unlit texture: renders only the texture, without any lighting

  • Unlit color: renders only a solid color, without any lighting

  • Unlit texture color: renders the texture multiplied by a color, without any lighting

  • Unlit texture alpha: renders the texture multiplied by a color, without any lighting, using the alpha value from the Alpha texture

  • Skin blend: a more complex shader that can be used to blend the texture with the underlying skin.

  • Simple morph: a shader used when "Dynamic UVs" is enabled; it uses the UV2 set of texture coordinates.

  • Simple Morph Tex: same as the above, but uses a diffuse texture.

  • Reflection Cubemap: reflects the given skybox (cubemap) on the material, additionally mixed with the given color parameter. This is the recommended reflection shader.

  • Reflection Single Image: reflects the given image on the material, additionally mixed with the given color parameter. Also provides scale parameters for the texture and normals.

  • Standard shader, Standard shader diffuse, and Standard shader with textures: explained in the next section

  • LUT Fixed Intensity: A shader used with Postprocessing effects to apply custom color transformations to the frame.

  • LUT With Alpha: Same as above but with alpha texture as a mask

  • Background blur: Shader that produces the blur effect on the material

  • Beauty simple: Shader that applies adaptive smoothing on the material. Usually used on the face to accomplish the soft skin/beauty effect.

  • Beauty LUT: Same as "Beauty simple" shader with additional LUT that can be applied on the material.

  • Chroma key: The "green screen" effect shader. It subtracts the selected color from the texture. Usually used with JPEG textures that don't support alpha transparency to achieve a transparent effect.

  • MatCap: uses material capture textures that contain lighting and reflection information. The MatCap shader requires a spherical image as a source.

  • MatCap Normal Map: same as above, but with a normal map.

  • PBR Shaders: advanced shaders that use physically based rendering and image-based lighting.
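To make one of the shaders above concrete: the Chroma key effect boils down to deriving a per-pixel alpha from how close the pixel color is to the chosen key color. The following Python snippet is an illustrative sketch only; the threshold value and the distance-based alpha ramp are assumptions, not the actual math used by the shader.

```python
def chroma_key_alpha(pixel, key, threshold=0.4):
    """Illustrative chroma-key: alpha from RGB distance to the key color.

    pixel, key: (r, g, b) tuples with components in [0, 1].
    Returns 0.0 (fully transparent) when the pixel matches the key color,
    ramping up to 1.0 (fully opaque) at/beyond the threshold distance.
    NOTE: threshold and the linear ramp are assumptions for illustration.
    """
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    return min(dist / threshold, 1.0)

green = (0.0, 1.0, 0.0)
print(chroma_key_alpha((0.0, 1.0, 0.0), green))  # exact key color -> 0.0
print(chroma_key_alpha((1.0, 0.0, 0.0), green))  # far from key   -> 1.0
```

This is why the shader pairs well with JPEG textures: the transparency is computed from color at runtime instead of being stored in an alpha channel.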

You can find the source code of our shaders here:

You can use these shaders for reference or edit them to create your own custom shaders.

Standard shader

Standard shader variations are special shaders that take material properties and scene lighting into account to calculate the final fragment color. The Blinn-Phong lighting model is used for the calculation. Standard shaders take four light sources into account:

  • Ambient light: represents global illumination in the scene, meaning it approximates all the light scattering in the scene. It is set by changing the "Ambient color" and intensity parameters.

  • Directional light: a light model component that approximates a light source that is far away and whose light rays are parallel (e.g. sunlight). It is defined by color, light direction, and intensity parameters.

  • Two point lights: components of the light model that are positioned somewhere in the scene and shine in all directions. Defined by position, color, and intensity. This type of light has less impact on an object the further the object is from the light source (light attenuation).

Every light source's intensity can be modified with the "Light intensity" uniform parameter. This is a 3-value field where the first value controls the directional light intensity, and the second and third control the intensity of point lights 1 and 2, respectively. The default value is 1.0 for each light. NOTE: the initial setup might be too bright for some materials, so try turning off some lights (set their intensity to 0.0) before setting material parameters.

The specular component of the light can be controlled with the "Specular color" parameter. The alpha channel of the chosen color acts as a specular intensity modifier. If the alpha is set to 0, there will be no specular light component on the material. In the "Standard shader with textures" variant, this component can be defined with a specular map texture, which defines the specular value of the material for each fragment.

The diffuse component of the material can be either set by the "Diffuse color" parameter in the basic "Standard shader" version, or by diffuse texture in the other two versions of the standard shader. This component defines the color of the material.
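To make the roles of these parameters concrete, here is a minimal Python sketch of a Blinn-Phong-style calculation for one directional light. It is an illustration of the general lighting model described above, not DeepAR's exact shader code; the function name and the exact way intensity scales the result are assumptions.

```python
def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_dir_light(normal, view_dir, light_dir, diffuse_color,
                          light_color, ambient, specular_color, shininess,
                          intensity):
    """One directional light: ambient + diffuse + specular, scaled by intensity.

    Colors are (r, g, b); specular_color is (r, g, b, a), where the alpha
    acts as the specular intensity modifier (alpha 0 -> no specular).
    Illustrative sketch only, not DeepAR's actual shader math.
    """
    n, v, l = normalize(normal), normalize(view_dir), normalize(light_dir)
    diff = max(dot(n, l), 0.0)                        # Lambertian diffuse term
    h = normalize(tuple(a + b for a, b in zip(v, l))) # Blinn half vector
    spec = max(dot(n, h), 0.0) ** shininess * specular_color[3]
    return tuple(
        intensity * (amb + diff * lc * dc + spec * sc)
        for amb, lc, dc, sc in zip(ambient, light_color, diffuse_color,
                                   specular_color[:3])
    )

# Light shining straight at a red surface facing the viewer;
# specular alpha is 0, so there is no specular highlight:
color = blinn_phong_dir_light(
    normal=(0, 0, 1), view_dir=(0, 0, 1), light_dir=(0, 0, 1),
    diffuse_color=(1.0, 0.0, 0.0), light_color=(1.0, 1.0, 1.0),
    ambient=(0.1, 0.1, 0.1), specular_color=(1.0, 1.0, 1.0, 0.0),
    shininess=32.0, intensity=1.0)
print(color)
```

Setting the specular alpha to 0 here removes the highlight entirely, which mirrors how the "Specular color" alpha behaves in the Standard shader.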

Custom shaders

Users can write their own shaders and use them in DeepAR Studio. To do so, you need to do the following:

  • Write shaders and save them somewhere on the file system

  • Instruct DeepAR Studio where the shaders are

  • Compile the shaders from DeepAR Studio

  • Use them on materials

Writing your own shaders

To use custom shaders in DeepAR Studio, the user must first write one. The DeepAR SDK uses the bgfx library as part of its 3D engine. All custom shader code must be written in bgfx's cross-platform shader language ("BCPSL"). It is similar to GLSL, with slight differences which are described here. Additionally, DeepAR Studio provides the full code of all embedded shaders, which can be used as a reference. A few rules must be followed to successfully use a custom shader in DeepAR Studio:

One shader, one folder

All shader code for a given shader must be placed inside a single folder. The name of the folder is the shader name, which is important later on (in the following explanations we will use "shader_name" as a string that represents the name of the shader).

One shader, 4 files

A shader definition consists of 4 files, which are all placed in the folder mentioned in the previous step. Those are: vs_shader_name.sc, fs_shader_name.sc, varying.def.sc, and shader_name.json. We'll explain each one individually.
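As a concrete example, here is the expected on-disk layout for a hypothetical custom shader named "my_glow". The shader name and root folder are made up for illustration; the file name prefixes/suffixes follow the rules below, and "varying.def.sc" is the bgfx convention for the varying definition file.

```python
import tempfile
from pathlib import Path

# Stand-in for a shader root path registered in DeepAR Studio:
root = Path(tempfile.mkdtemp())

# Hypothetical custom shader "my_glow": the folder name must equal the shader name.
shader_dir = root / "my_glow"
shader_dir.mkdir(parents=True, exist_ok=True)

for filename in [
    "vs_my_glow.sc",    # vertex shader: vs_ + shader_name + .sc
    "fs_my_glow.sc",    # fragment shader: fs_ + shader_name + .sc
    "varying.def.sc",   # attribute/varying definitions (bgfx convention)
    "my_glow.json",     # UI parameter description: shader_name + .json
]:
    (shader_dir / filename).touch()

print(sorted(p.name for p in shader_dir.iterdir()))
```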

vs_shader_name.sc

Represents the vertex shader, written in BCPSL. The name of the file must adhere to the following rules:

  • "vs_" - fixed prefix

  • "shader_name" - the name of the shader. Must be the same as the parent folder

  • ".sc" - fixed suffix

Here is an example vertex shader definition in BCPSL (the embedded standard_shader_full vertex shader):

$input a_position, a_normal, a_texcoord0, a_tangent
$output v_wpos, v_view, v_normal, v_tangent, v_bitangent, v_texcoord0

#include ""

uniform vec4 u_lightPosDir;
uniform vec4 u_lightPosPoint1;
uniform vec4 u_lightPosPoint2;

void main(){
    v_texcoord0 = a_texcoord0;
    vec3 wpos = mul(u_model[0], vec4(a_position, 1.0) ).xyz;

    vec3 wnormal = mul(u_model[0], vec4(a_normal.xyz, 0.0) ).xyz;
    vec3 wtangent = mul(u_model[0], vec4(a_tangent.xyz, 0.0) ).xyz;

    vec3 viewNormal = normalize(mul(u_view, vec4(wnormal, 0.0) ).xyz);
    vec3 viewTangent = normalize(mul(u_view, vec4(wtangent, 0.0) ).xyz);
    vec3 viewBitangent = cross(viewNormal, viewTangent) * a_tangent.w;
    mat3 tbn = mat3(viewTangent, viewBitangent, viewNormal);

    v_wpos = wpos;

    vec3 view = mul(u_view, vec4(wpos, 0.0) ).xyz;
    v_view = mul(view, tbn);

    v_normal = viewNormal;
    v_tangent = viewTangent;
    v_bitangent = viewBitangent;

    gl_Position = mul(u_viewProj, vec4(wpos, 1.0) );
}


Input attributes are defined by the "$input" statement. Attributes are vertex data loaded from the .fbx file mentioned in the model loading chapter. In the same line, after the $input statement, all the attributes that are used must be named. Every used attribute must also be declared in the varying.def.sc file. More info on attributes is given in the section about the varying.def.sc file.

Output varyings are defined by the "$output" statement. Compared to attribute inputs, varyings are user-defined fields that also have to be declared, with their type/precision and semantic information, in the varying.def.sc file.

An #include of the common shader library is obligatory in all vertex and fragment shaders and must appear before the main function. Users can also write their own additional files and include them, e.g. for common functions that are used in multiple shaders.

Uniforms are defined with the uniform keyword. Every uniform must have a type. In addition to the user-defined uniforms, the model, view, and projection matrices are also available in the vertex (and fragment) shaders as uniforms:

  • u_model - model matrix

  • u_view - view matrix

  • u_invView - inverted view matrix

  • u_proj - projection matrix

  • u_invProj - inverted projection matrix

  • u_viewProj - view * projection matrix

  • u_invViewProj - inverted view * projection matrix

  • u_modelView - model * view matrix

  • u_modelViewProj - model * view * projection matrix
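The combined matrices in the list above are simply products of the basic ones. The following Python sketch (plain nested lists, column-vector convention assumed for illustration) shows that transforming a point by the combined model-view-projection matrix gives the same result as applying model, view, and projection one after another:

```python
def matmul(a, b):
    """4x4 matrix product (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(m, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(m[i][k] * v[k] for k in range(4)) for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Toy model/view matrices (pure translations) and an identity "projection":
u_model = translation(1, 0, 0)
u_view = translation(0, 2, 0)
u_proj = translation(0, 0, 0)  # identity stand-in for a real projection

# u_modelViewProj = proj * view * model (column-vector convention)
u_modelViewProj = matmul(u_proj, matmul(u_view, u_model))

p = [5, 5, 5, 1]
step_by_step = apply(u_proj, apply(u_view, apply(u_model, p)))
combined = apply(u_modelViewProj, p)
print(step_by_step == combined)  # True
```

This is the same saving the precombined uniforms give you in a shader: one matrix multiply per vertex instead of three.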

fs_shader_name.sc

Represents the fragment shader, written in BCPSL. The name of the file must adhere to the following rules:

  • "fs_" - fixed prefix

  • "shader_name" - the name of the shader. Must be the same as the parent folder

  • ".sc" - fixed suffix

BCPSL fragment shader example (standard_shader_full):

$input v_wpos, v_view, v_normal, v_tangent, v_bitangent, v_texcoord0

#include ""

SAMPLER2D(s_texDiffuse, 0);
SAMPLER2D(s_texSpecular, 1);
SAMPLER2D(s_texNormal, 2);

uniform vec4 u_shininess;
uniform vec4 u_specular_color;

uniform vec4 u_ambientColor;

uniform vec4 u_lightPosDir;
uniform vec4 u_lightPosPoint1;
uniform vec4 u_lightPosPoint2;

uniform vec4 u_colorDir;
uniform vec4 u_colorP1;
uniform vec4 u_colorP2;

uniform vec4 u_intensity;

void main(){
    mat3 tbn = mat3(v_tangent, v_bitangent, v_normal);

    vec3 normal = texture2D(s_texNormal, v_texcoord0).xyz;
    normal = normalize(normal * 2.0 - 1.0);
    vec3 viewDir = normalize(-v_view);

    vec4 diffuseColor = toLinear(texture2D(s_texDiffuse, v_texcoord0) );
    vec4 specularMaterial = texture2D(s_texSpecular, v_texcoord0);

    vec3 directionalLightDir = mul(tbn, normalize(u_lightPosDir.xyz) );
    vec4 directionalLight = CalcDirLight(directionalLightDir, normal, viewDir,
        diffuseColor, specularMaterial, u_specular_color, u_shininess.x,
        u_ambientColor, u_colorDir, u_intensity.x);

    vec3 pointLight1Dir = mul(tbn, normalize(u_lightPosPoint1.xyz - v_wpos) );
    float distance1 = length(u_lightPosPoint1.xyz - v_wpos) * 0.001;
    vec4 pointLight1 = CalcPointLight(pointLight1Dir, distance1, normal, viewDir,
        diffuseColor, specularMaterial, u_specular_color, u_shininess.x,
        u_ambientColor, u_colorP1, u_intensity.y);

    vec3 pointLight2Dir = mul(tbn, normalize(u_lightPosPoint2.xyz - v_wpos) );
    float distance2 = length(u_lightPosPoint2.xyz - v_wpos) * 0.001;
    vec4 pointLight2 = CalcPointLight(pointLight2Dir, distance2, normal, viewDir,
        diffuseColor, specularMaterial, u_specular_color, u_shininess.x,
        u_ambientColor, u_colorP2, u_intensity.z);

    gl_FragColor.xyz = (directionalLight + pointLight1 + pointLight2).xyz;
    gl_FragColor = toGamma(gl_FragColor);
    gl_FragColor.w = diffuseColor.a;
}


The functions CalcDirLight and CalcPointLight are available through the included common shader file. They calculate the directional and point light contributions for the fragment, given the input parameters. These functions are available when writing custom shaders. The function "toGamma" performs gamma correction and is one of the many functions available from the bgfx library; the whole list of available functions can be found here.

All the rules explained in the vertex shader file section also apply when writing fragment shader code.

varying.def.sc

The file in which all attributes and varyings are defined. Every shader must have this file, and it must be named exactly "varying.def.sc". Below is the example file for our standard_shader_full example:

vec3 v_wpos : POSITION = vec3(0.0, 0.0, 0.0);
vec3 v_view : POSITION = vec3(0.0, 0.0, 0.0);
vec3 v_normal : NORMAL = vec3(0.0, 0.0, 1.0);
vec3 v_tangent : TANGENT = vec3(1.0, 0.0, 0.0);
vec3 v_bitangent : BINORMAL = vec3(0.0, 1.0, 0.0);
vec2 v_texcoord0 : TEXCOORD0 = vec2(0.0, 0.0);

vec3 a_position : POSITION;
vec4 a_normal : NORMAL;
vec2 a_texcoord0 : TEXCOORD0;
vec4 a_tangent : TANGENT;

Both varyings and attributes must have a type/precision (float, vec2, vec3, vec4...) and a semantic (POSITION, NORMAL, TANGENT, TEXCOORD0...) defined. The default value is not obligatory. Varyings are user-defined, and attributes can be one of the following:

  • a_position - vertex position

  • a_color0 - vertex color (only one color per vertex currently supported)

  • a_texcoord0 - first UV mapping for the vertex (two mappings are currently supported)

  • a_texcoord1 - second UV mapping for vertex

  • a_normal - vertex normal

  • a_tangent - vertex tangent. When loading .fbx models, if there is no tangent given for a vertex, one will be calculated.


shader_name.json

This file defines all shader parameters visible in the user interface, such as the display name, uniforms, textures, etc. A complete shader JSON file looks something like this (this is an example from our background blur shader):

{
    "name" : "background_blur",
    "displayName" : "Background blur",
    "textures" : [
        {
            "name" : "s_texDiffuse",
            "displayName" : "Diffuse texture",
            "stage" : 0
        },
        {
            "name" : "s_alphaColor",
            "displayName" : "Alpha map",
            "stage" : 1
        }
    ],
    "uniforms" : [
        {
            "name" : "u_blurSize",
            "displayName" : "Blur size",
            "type" : "Vec1",
            "defaultValue" : [5.0]
        }
    ]
}
All properties of the JSON object should be defined:

  • name - shader name by which the system identifies it

  • displayName - Shader name that will be displayed in the UI

  • textures - an array of JSON texture objects. Defines all the textures used in the shader. A JSON texture object consists of the following properties:

      ◦ name - the uniform name for the texture within the shader code

      ◦ displayName - display name for the texture in the UI

      ◦ stage - texture unit (stage) index

  • uniforms - an array of JSON uniform objects. Defines all the uniforms used in the shader. A JSON uniform object consists of the following properties:

      ◦ name - uniform name within the shader code

      ◦ displayName - display name for the uniform in the UI

      ◦ defaultValue - the default value of the uniform, given as an array of numbers

      ◦ type - input type for the uniform, used in the UI to decide which input control will be displayed to edit the uniform value. Can be one of the following:

          ▪ Vec1 - a scalar value. One numeric input box is displayed in the UI.

          ▪ Vec3 - a vector with 3 elements. Three numeric input boxes are displayed in the UI.

          ▪ Vec4 - a vector with 4 elements. A color picker is displayed in the UI.
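Since the file is plain JSON, its required fields can be checked with a few lines of Python. This sketch validates a shader description of the shape shown above; the validation rules themselves are an assumption for illustration, and DeepAR Studio's own checks may differ.

```python
import json

# The background_blur example from above, as a JSON string:
shader_json = """
{
  "name": "background_blur",
  "displayName": "Background blur",
  "textures": [
    {"name": "s_texDiffuse", "displayName": "Diffuse texture", "stage": 0},
    {"name": "s_alphaColor", "displayName": "Alpha map", "stage": 1}
  ],
  "uniforms": [
    {"name": "u_blurSize", "displayName": "Blur size",
     "type": "Vec1", "defaultValue": [5.0]}
  ]
}
"""

def validate_shader_desc(desc):
    """Check the properties described above; return a list of problems found."""
    problems = []
    for key in ("name", "displayName", "textures", "uniforms"):
        if key not in desc:
            problems.append(f"missing top-level property: {key}")
    for tex in desc.get("textures", []):
        if not all(k in tex for k in ("name", "displayName", "stage")):
            problems.append(f"incomplete texture object: {tex}")
    for uni in desc.get("uniforms", []):
        if uni.get("type") not in ("Vec1", "Vec3", "Vec4"):
            problems.append(f"unknown uniform type: {uni.get('type')}")
    return problems

print(validate_shader_desc(json.loads(shader_json)))  # [] -> no problems
```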

Custom shader paths

Once the custom shaders are written, DeepAR Studio must be made aware of their location. In the menu, choose Assets->Set shader paths. In the window that appears, select "Add new" and choose the root path for all of your shaders. NOTE: you don't choose the shader folder itself, but its parent. Organize all of your shaders in this manner; this way, all shaders in one root folder can be batch processed. You can organize your shaders in as many root paths as you want.

Add shader root path

Compile shaders

Once the root paths for the custom shaders are set, and before they are used, the custom shaders must be compiled. Select the Assets->Compile shaders action from the menu. In the new window, simply press the "Compile shaders" button and wait for the process to complete. The Studio will scan all the shader root paths and compile every shader found there that adheres to the rules given in the previous sections. Every compilation attempt and result will be logged in the output console. If a shader cannot be compiled due to syntax errors, the user will get an error log pointing to the error. If there are no changes to the shader code since the last compilation, DeepAR Studio will not try to compile that shader again.

Shader compilation
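The skip-if-unchanged behavior described above is the usual content-based caching scheme. Below is a generic Python sketch of the idea; this is an illustration of the technique, not DeepAR Studio's actual implementation, and the compile step is a placeholder.

```python
import hashlib
import tempfile
from pathlib import Path

_cache = {}  # shader folder path -> content hash at last successful compile

def shader_hash(shader_dir):
    """Hash all files in the shader folder so any edit changes the digest."""
    digest = hashlib.sha256()
    for f in sorted(shader_dir.glob("*")):
        digest.update(f.name.encode())
        digest.update(f.read_bytes())
    return digest.hexdigest()

def compile_if_changed(shader_dir):
    """Return True if the shader was (re)compiled, False if it was skipped."""
    h = shader_hash(shader_dir)
    if _cache.get(str(shader_dir)) == h:
        return False  # unchanged since the last compilation: skip
    # ... placeholder for the actual shader compilation step ...
    _cache[str(shader_dir)] = h
    return True

# Demo on a throwaway shader folder:
tmp = Path(tempfile.mkdtemp())
(tmp / "vs_demo.sc").write_text("void main(){}")
first = compile_if_changed(tmp)   # compiled
second = compile_if_changed(tmp)  # skipped, nothing changed
(tmp / "vs_demo.sc").write_text("void main(){ /* edited */ }")
third = compile_if_changed(tmp)   # compiled again after the edit
print(first, second, third)  # True False True
```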

Once compiled, all the shaders will be available and ready to use in the Material details panel.
