Wk 4: Research

Shaders and textures control how the 3D model will appear visually. In 3D programs, these are applied to the material, which is then applied to the model. Sourced from: http://www.informit.com/articles/article.aspx?p=2162089&seqNum=2





Texture refers to the flat image applied to the model that gives it colour, patterns and details (Geig, 2013). Through the process of texture mapping, a two-dimensional texture file is laid over the top of the model's surface. Texturing can range in complexity from simply colouring the model to creating a photo-realistic image (Slick, 2014).

Types of Maps

There are a variety of different texture maps that allow for different outcomes (Reallusion, 2011). Depicted in the image below are some of the commonly used maps:

Sourced from: http://proletariat.com/2013/10/08/building-the-medic/

  • The diffuse map holds the colour detail. This is where a majority of the detailed texture painting will occur (Digital Lighting).
  • The normal and bump maps allow additional shading without having to change the model (Digital Lighting).
  • The gloss or specular map determines the shininess of the object. White areas will be more specular and therefore shinier (Reallusion, 2011).
  • The ambient occlusion (AO) maps are overlaid onto the diffuse map to give soft realistic shading (Digital Lighting).
  • The alpha map determines opacity. This allows for intricate detailing such as hair, as shown below (Reallusion, 2011).

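To make the roles of these maps concrete, the following Python sketch shows how a renderer might combine samples from them for a single texel. The function, names and numbers are illustrative assumptions, not any particular engine's API:

```python
# Sketch: how a renderer might combine texture map samples for one texel.
# All values are normalised to [0, 1]; names and numbers are illustrative.

def combine_maps(diffuse_rgb, ao, specular, alpha):
    """Return the shaded colour and opacity for a single texel."""
    # Ambient occlusion darkens the diffuse colour (multiplied in).
    shaded = tuple(round(channel * ao, 3) for channel in diffuse_rgb)
    # The specular value would scale any highlight added later;
    # the alpha value is passed through as the texel's opacity.
    return {"rgb": shaded, "specular": specular, "opacity": alpha}

# A red brick texel, slightly occluded, matte and fully opaque:
texel = combine_maps((0.8, 0.2, 0.1), ao=0.75, specular=0.1, alpha=1.0)
print(texel["rgb"])  # (0.6, 0.15, 0.075)
```

The key point is that the diffuse map carries the base colour while the other maps only modify how that colour is lit and displayed.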

Appearance of Surface

For some surfaces like fur, cloth or rough surfaces with bumps or scratches, it is not feasible to model every single minute detail. Because 3D modelling is such a lengthy process, the artist can use texture maps to “fake” some of the surfaces (McDermott, 2011). The 3D models of the buildings shown below have either a brick surface or smooth surface:


However, when texturing the artist is able to incorporate digital painting techniques into the diffuse and normal maps in order to give the surfaces minute scratches, dents and bumps.


Sourced from: http://www.evermotion.org/tutorials/show/7984/making-of-deserted-village

By using textures to give the illusion of surface, artists are able to create photorealistic models with depth, grunge, opacity and details while also being efficient.



Shaders are a set of parameters that outline how the computer should display the surface of the model. According to Slick (2014), the model’s interaction with light, reflectivity, opacity and glossiness are all determined by the shaders. Typically they are included in the 3D modelling program, allowing the creator to simply tweak the shaders until they achieve their goal (Slick, 2014).

The Appearance of Surface

The large variety of shaders available allows the model to be given the appearance of almost any surface. There are three basic types of shaders, which are described below (Geig, 2013).

Diffuse: evenly distributes light across the object's surface; used for surfaces such as skin.


A diffuse shader is used on a brick cube. Sourced from: http://docs.unity3d.com/uploads/Shaders/Shader-NormalDiffuseDetail.png

Specular: gives the object a glossy or shiny surface; used for surfaces such as polished metal or water.


A water surface with a specular shader. Sourced from: http://forum.unity3d.com/threads/specular-water.64593/

Bumped: used in conjunction with a normal map to give the surface physical details such as bumps, indents or scratches.


A bumped shader is used with a specular shader on a brick cube. Sourced from: http://game.ceeger.com/Components/Images/shader-NormalBumpedSpecular-0.jpg

In addition to this, a main colour can be chosen with each shader. This determines the colour of the ambient light that will be shone on the object (Geig, 2013). Just as a coloured spotlight does not alter a person’s colour, neither does the main colour: it merely lights the object with the chosen colour.
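The difference between the diffuse and specular shader types can be illustrated with a small Python sketch of the standard Lambert (diffuse) and Phong (specular) terms. The vectors and shininess value below are illustrative assumptions:

```python
# Sketch of the diffuse/specular distinction (Lambert + Phong terms).
# Vectors are 3-tuples assumed already normalised; values are illustrative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, to_light, to_viewer, shininess=32):
    """Return (diffuse, specular) intensities in [0, 1] for one point."""
    # Diffuse: light spread evenly, depends only on the angle to the light.
    d = dot(normal, to_light)
    diffuse = max(0.0, d)
    # Specular: a tight highlight around the mirror-reflection direction.
    reflected = tuple(2 * d * n - l for n, l in zip(normal, to_light))
    specular = max(0.0, dot(reflected, to_viewer)) ** shininess
    return diffuse, specular

# Light straight above a flat surface, viewer also straight above:
d, s = shade((0.0, 1.0, 0.0), (0.0, 1.0, 0.0), (0.0, 1.0, 0.0))
print(d, s)  # 1.0 1.0  (fully lit, highlight aimed at the viewer)
```

Raising the shininess exponent narrows the highlight, which is what makes a surface read as polished metal rather than matte skin.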


Digital Lighting. Types of Texture Mapping. Retrieved from

Geig, M. (2013). Working with Models, Materials, and Textures in Unity Game Development. Retrieved from

Maya Learning Channel. (2011). Applying Shaders and Materials – Part 1. Retrieved from

McDermott, W. (2011). Faking surface detail for mobile assets. Retrieved from

Reallusion Inc. (2011). Types of Maps. Retrieved from

Slick, J. (2014). Anatomy of a 3D Model. Retrieved from


Wk 3: Research

Ed Catmull: A 3D Computer Graphics Pioneer

Dr. Ed Catmull, currently the president of Pixar and Disney Animation, was one of the founding fathers of Pixar (PW, 2008). He had the drive to animate but lacked the drawing skills, so he focused on computer graphics, which he believed could be utilised to create feature-length animations.


Sourced from: http://waltdisneystudios.com/corp/unit/6/bio/53

With a degree in Computer Science and Physics, Catmull has received several awards for his multiple industry achievements (Disney, 2009). His most notable contributions to the field of computer graphics include: the development of the z-buffer and the invention of texture mapping (PW, 2008).

Z-buffering is an algorithm which manages the depth and positioning of assets in a 3D environment: in particular, it ensures that an object in the foreground will block (or partially block) the view of an object in the background, therefore behaving as real-life objects would (Rouse, 2005). This is demonstrated below:


Sourced from: http://en.wikipedia.org/wiki/Z-buffering
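The core of the z-buffer algorithm described above fits in a few lines. This Python sketch uses illustrative buffer sizes, object names and depth values, with smaller z meaning closer to the camera:

```python
# A minimal z-buffer sketch: for each pixel, keep only the nearest depth.
# Smaller z = closer to the camera here; sizes and values are illustrative.

WIDTH, HEIGHT = 4, 3
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
colour = [[None] * WIDTH for _ in range(HEIGHT)]

def plot(x, y, z, c):
    """Draw a fragment only if it is nearer than what is already there."""
    if z < depth[y][x]:
        depth[y][x] = z
        colour[y][x] = c

plot(1, 1, z=5.0, c="background cube")
plot(1, 1, z=2.0, c="foreground sphere")   # nearer: overwrites
plot(1, 1, z=9.0, c="distant wall")        # farther: discarded
print(colour[1][1])  # foreground sphere
```

Because the comparison happens per pixel, objects can correctly overlap each other partially, just as in the image above.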

Texture mapping is the process by which a 2D image or texture map can be applied to a 3D asset in order to give it a desired look (Boudon, 2013).


A 3D cube is textured to look like a crate. Sourced from: http://satish3dartist.blogspot.com.au/2010/08/texturing-mapped-model.html
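At its simplest, texture mapping is a lookup from a 2D coordinate into an image. The following Python sketch, with a made-up two-by-two "texture", shows a nearest-neighbour version of that lookup:

```python
# Sketch of texture mapping: a (u, v) coordinate in [0, 1] selects a texel
# from a 2D image. The tiny checker "texture" here is purely illustrative.

texture = [
    ["dark", "light"],
    ["light", "dark"],
]  # 2x2 texels

def sample(texture, u, v):
    """Nearest-neighbour lookup: map (u, v) to a row and column."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A vertex mapped to the top-right of the texture gets the "light" texel:
print(sample(texture, 0.9, 0.1))  # light
```

Each vertex of the model stores such a (u, v) coordinate, which is how the flat crate image ends up wrapped around the cube.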

Without Ed Catmull, the graphics industry would not be what it is today.

Democratisation in the 3D Graphics Industry

In this case, democratisation refers to making the 3D graphics industry democratic: that is, creating an industry where power, resources and opportunities are spread equally among the people (Dictionary.com, 2014).

Until recently, a divide existed between the large, professional 3D graphics studios and the enthusiast at home with a computer. However, recent changes in the way digital content is created and distributed have created new opportunities as the industry adapts in order to remain viable in the global marketplace (Estes, 2013). Both software and hardware are becoming more affordable, powerful and accessible; no longer are these exclusive to large companies (CB, 2013). In fact, many graphics programs are now free to install and use, such as Blender, GIMP, Sketch Up and Unity1 (JPR, 2012).

Created with Blender, Sintel is a 15 minute animation available for free download: https://durian.blender.org/download/

The implications for the industry are as follows:

  • Self-trained artists have the tools and, due to the availability of online tutorials, the ability to become viable artists within the industry.
    • This rise of non-traditionally trained talent brings new ideas, creativity and competition into the industry.
    • It does, however, also dilute the workforce, making job opportunities even rarer.
  • Independent studios can gain a foothold in the industry, allowing them to reach new demographics and further the marketability and prominence of 3D graphics.
  • With software and hardware available globally, and the ability to communicate online, larger companies can create content across multiple geographical locations.
    • This increases creativity within the industry, as new cultures and views will be brought to the table.
    • This drastically increases job opportunities, as one is no longer limited to their town, city or country.
    • Larger companies are once more at an advantage, therefore gaining a foothold above individuals and independents.

(Creative Blog, 2013; Estes, 2013; John Peddie Research, 2012).

Emerging Technologies: Ptex

One emerging open-source technology that could help further democratisation is Ptex. Developed by Walt Disney Animation Studios, Ptex is a “per-face texture mapping system that does not require UV assignment” (Seymour, 2014). Essentially, Ptex eliminates the process of UV assignment, which is time-consuming and difficult (Ptex, 2012). According to their website, Ptex (2012) does so in a way which allows for “any number of textures to be stored in a single file” and for seamless filtering.


First example of Ptex, with the individual faces displayed (left) and the final textured model (right). Sourced from http://ptex.us/overview.html

When using Ptex, the texture is painted directly onto the 3D model. This saves an incredible amount of time and guarantees no seams (Masters, 2014). Disney has used Ptex exclusively on their last four major works (Frozen, Wreck-It Ralph, Tangled and Bolt) and the technology is now being picked up by many, including Pixar (Seymour, 2014).

Speed texturing using Mudbox and Ptex. Sourced from: https://www.youtube.com/watch?v=dzIfW1dPNHs

The implications for asset creation are enormous. Ptex cuts an entire step out of the production pipeline and saves a large amount of time. This technology allows film-quality assets to be created efficiently and can plug in to industry-standard software such as Mudbox (Masters, 2014). Ptex has the potential to revolutionise the creation of 3D graphics for animations and films. However, Ptex is currently not suitable for use in video games, as they must be rendered in real-time (Masters, 2014).
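The per-face idea behind Ptex can be sketched as a data structure in which each face owns its own texel grid, so no shared UV layout (and hence no seams between islands) is needed. This Python sketch illustrates the concept only; it is not the real Ptex API:

```python
# Sketch of the per-face idea behind Ptex (NOT the real Ptex API):
# each face of the mesh owns its own small texture, so no shared UV
# layout is required and each face can use a different resolution.

class PerFaceTextures:
    def __init__(self):
        self.faces = {}  # face id -> 2D grid of texels

    def paint(self, face_id, resolution, fill):
        """Give one face its own texel grid at any resolution."""
        self.faces[face_id] = [[fill] * resolution for _ in range(resolution)]

    def sample(self, face_id, u, v):
        """Look up a texel with coordinates local to that face."""
        grid = self.faces[face_id]
        n = len(grid)
        return grid[min(int(v * n), n - 1)][min(int(u * n), n - 1)]

textures = PerFaceTextures()
textures.paint(face_id=0, resolution=4, fill="scales")
textures.paint(face_id=1, resolution=16, fill="skin")  # finer detail here
print(textures.sample(1, 0.5, 0.5))  # skin
```

Being able to vary the resolution per face is part of what lets artists paint fine detail exactly where it is needed.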


1The free version of Unity is available for commercial use as long as the creator/s or company does not exceed $100,000 in profits (Unity Technologies, 2014).


Boudon, G. (2013). Understanding a 3D Production Pipeline – Learning the Basics. Retrieved from

Creative Blog (CB). (2013). The democratisation of computer graphics. Retrieved from

Dictionary.com. (2014). Democratize. Retrieved from

Disney. (2009). Dr. Ed Catmull. Retrieved from

Estes, J. (2013). Is the Democratization of Graphics a Good Thing?. Retrieved from

John Peddie Research (JPR). (2012). The Democratisation of 3D. Retrieved from

Masters, M. (2014). Understanding Ptex – Is it the Future of Texturing?. Retrieved from

Pixar Wiki (PW). (2008). Ed Catmull. Retrieved from

Ptex. (2012). Ptex Overview. Retrieved from

Rouse, M. (2005). Z-buffering. Retrieved from

Seymour, M. (2014). Ptex, the other side of texturing. Retrieved from

Unity Technologies. (2014). License Comparisons. Retrieved from

Wk 2: Research

Current trends

The rise of the “indie” (independent) game industry has caused (or been caused by) a shift in 3D modelling. The most relevant change in 3D modelling is the software used to create it. Free programs, such as Blender, Sketch Up and the free version of Unity, have allowed independent or small companies to create 3D models and video games for free, thus gaining a foothold in the industry (JPR, 2012).

Typical 3D Modelling Pipeline for Games

Concept Drawings

Concept drawings are used extensively in order to flesh out the game concepts, aesthetic and feel (Ciszek, 2012).


Early concept work for Sunset Overdrive. Sourced from: http://conceptartworld.com/?p=34097

From these concepts drawings the core shapes, structures and visuals are broken down in order to begin the process of prototyping and developing the assets (Epic Games, 2012).


Refined character concept art for Sunset Overdrive. Sourced from: http://conceptartworld.com/?p=34097


Modelling

Using the concept art, the modeller creates a 3D model of the character or asset. Programs such as 3DsMax and Maya are typically used for this. First, a medium to high poly mesh is created and then transferred to another program for sculpting (Epic Games, 2012). Ciszek (2012) notes that “in the game industry, most models are created as surface models”, as opposed to solid models, as both the modellers and game engines can handle them better.


From the concept drawing (left), a 3D model is created for Assassin’s Creed Unity. Sourced from: http://www.pierrebertin.com/search/label/AC3


Sculpting

Sculpting allows extra detail to be added to the model; this is typically done with additional software such as Mudbox or Z-brush (Ciszek, 2012). In this step the artist is able to bring the concept drawing to life by adding fine detail and refining aspects such as clothing and facial features.


The model is refined through sculpting. Sourced from: http://www.pierrebertin.com/search/label/AC3

High to Low Poly

As video games are rendered in real-time, the poly count of the final asset must be low (or lower than the sculpted version) in order for the game engine to handle it (Epic Games, 2012). Converting the mesh from high poly to low poly can be done in a variety of programs such as Z-brush and Maya.


Low and high poly versions of the same model shown side-by-side. Sourced from: http://audreee.deviantart.com/art/Naga-high-and-low-poly-282339160

UV-Mapping and Texturing

Through the process of unwrapping, a complete set of UVs is created, ready for texturing (Boudon, 2013). Texturing may be done in Photoshop and/or with secondary software such as Quixel. After the texture maps are finished, they are baked onto the low poly model; a game character will typically use diffuse, normal, ambient occlusion and specular maps (Ward, 2013).


Before and after texturing. Sourced from: http://cgi.tutsplus.com/articles/game-character-creation-series-kila-chapter-4-texture-baking-building–cg-28262


Rigging

Rigging is the process by which the character or asset is set up for animation (Boudon, 2013). The movements the character will need to make must be taken into consideration.


A game character is rigged using Maya. Sourced from: http://cgi.tutsplus.com/articles/game-character-creation-series-kila-chapter-6-basic-character-rigging–cg-31083


Animation

By manipulating the game character through their rig, the animator is able to breathe life and movement into the character (Boudon, 2013). Just as in feature films, good animations should be more than just the required actions.


GLaDOS, a robot from the Portal games, gains a human-like quality and emotion through animation. Sourced from: http://es.terraria.wikia.com/wiki/Archivo:GlaDos.gif


Boudon, G. (2013). Understanding a 3D Production Pipeline – Learning the Basics. Retrieved from

Ciszek, P. (2012). 3D Production Pipeline in Game Development. Retrieved from

Epic Games, Inc. (2012). Epic Games Design Workflow. Retrieved from

John Peddie Research (JPR). (2012). The Democratisation of 3D. Retrieved from

Ward, A. (2013). Game Character Creation Series. Retrieved from

Wk 1: Research

Typical Stages in a 3D Production Pipeline

3D production is a complex and precise operation. In order to remain effective and efficient, 3D production typically follows a standard pipeline which helps streamline the process. On larger productions, different stages may be executed by different people.

1.   Preproduction and Blocking

This is the initial planning stage where the final designs and assets are determined. Some productions, e.g. video games, may choose to use concept designs and development sketches to design their 3D assets. The visual development process involves taking the (usually) 2D sketches and concepts and determining how they will function in 3D.

Concept designs for Journey, a 3D video game. Sourced from: http://conceptartworld.com/?p=17180

In addition to this, the role of the asset, how it will interact (and move) and its relationship with its environment is determined through the process of blocking, a term that most commonly refers to actors working out a scene (Marshall, 2010).


Character concepts for Rapunzel from Disney’s Tangled. Sourced from: http://characterdesignnotes.blogspot.com.au/2010/11/disneys-tangled-character-design.html


2.   3D Modelling of Required Assets

Modelling is the process of taking the asset design and recreating it in 3D computer generated graphics. There are a variety of programs for this, including paid programs such as 3DsMax and free programs like Blender. Modelling involves the manipulation of vertices, edges and polys in order to create 3D shapes and assets (Slick, 2014). There are many different techniques with which to do this.


3D Model for Rapunzel from Disney’s Tangled. Sourced from: http://xnuccio.blogspot.com.au/2012/12/animation-pipeline.html


3.   UV Mapping

In this complicated and fiddly stage, the 3D asset is “unwrapped” so that it lies flat on the UV plane as a mesh, in order for a UV map to be created. The UV plane is a 2D plane in which U and V respectively represent the traditional X and Y axes (Guerrilla CG, 2009). In order for a texture to be applied to an asset, its UV map must lie completely flat on the UV plane. In the example below, a cube is ‘cut’ and is in the process of being flattened (this is stylised for educational purposes).


Sourced from: http://www.chocofur.com/tut_01_e.html

Once flat upon the UV plane a texture or image may be projected onto the cube’s mesh. This is shown below:


Left to right: 3D cube with the applied map; map being applied (or “wrapped”) around the 3D asset; the 3D asset sitting on top of the UV map. Sourced from: http://upload.wikimedia.org/wikipedia/commons/f/fe/Cube_Representative_UV_Unwrapping.png
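In code terms, unwrapping produces a mapping from each 3D vertex of a face to a 2D (u, v) coordinate on the flat plane. The layout below, for a single face of a unit cube, is an illustrative assumption:

```python
# Sketch of a UV layout: each 3D corner of one cube face is assigned a
# 2D (u, v) coordinate on the flat UV plane. The layout values are
# illustrative; a real unwrap places all six faces without overlap.

# One face of a unit cube (four 3D corners)...
face_3d = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]

# ...flattened to a quarter of the UV plane:
face_uv = [(0.0, 0.0), (0.5, 0.0), (0.5, 0.5), (0.0, 0.5)]

uv_map = dict(zip(face_3d, face_uv))
print(uv_map[(1, 1, 0)])  # (0.5, 0.5)
```

Once every vertex has such a coordinate, the painted texture can be projected back onto the model exactly as shown in the cube images above.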


4.   Texturing

Once the asset has been unwrapped, the UV map can then be painted or edited in order to give the asset texture (Boudon, 2013). This can be done in programs such as Photoshop.


Clockwise: final texture when applied to the 3D model; the 3D model without texture; the flattened texture maps ready to be applied to the model. Sourced from: http://www.thegnomonworkshop.com/news/2013/03/why-a-camera-is-a-texture-artists-best-friend/


5.   Rigging

The 3D asset is then bound to a character rig: a skeleton system of bones, joints and control handles which allows the character to be moved, as a doll would be, into a pose (Slick, 2014). When rigging, kinematics and joint hierarchy must be taken into consideration.


A complex character rig for a human model. Sourced from: http://alexnikolaev.blogspot.com.au/2010/10/advanced-rigging.html
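The joint hierarchy at the heart of a rig can be sketched with forward kinematics: each joint's world position is found by walking up through its parents, so rotating a shoulder carries the elbow and hand with it. The 2D joints, names and arm proportions below are illustrative:

```python
import math

# Sketch of forward kinematics in a joint hierarchy: rotating a parent
# joint carries all of its children with it, as a character rig does.

class Joint:
    def __init__(self, name, length, parent=None):
        self.name, self.length, self.parent = name, length, parent
        self.angle = 0.0  # local rotation in radians

def world(joint):
    """Return ((x, y), accumulated angle) for a joint."""
    if joint.parent is None:
        return (0.0, 0.0), joint.angle
    (px, py), pang = world(joint.parent)
    # This joint sits at the end of its parent's bone.
    pos = (px + joint.parent.length * math.cos(pang),
           py + joint.parent.length * math.sin(pang))
    return pos, pang + joint.angle

shoulder = Joint("shoulder", length=1.0)
elbow = Joint("elbow", length=1.0, parent=shoulder)
hand = Joint("hand", length=0.0, parent=elbow)

elbow.angle = math.pi / 2            # bend the elbow 90 degrees
(x, y), _ = world(hand)
print(round(x, 2), round(y, 2))      # 1.0 1.0

shoulder.angle = math.pi / 2         # now raise the whole arm
(x, y), _ = world(hand)
print(round(x, 2), round(y, 2))      # -1.0 1.0
```

This parent-to-child dependency is the "joint hierarchy" the paragraph above refers to, and it is why a pose can be set by adjusting a handful of control handles rather than every vertex.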


6.   Animation

Similar to 2D animation, 3D character animation involves creating a series of poses in a timeline that, when played, give the impression of movement (Boudon, 2013). Poses are created by moving and adjusting the character rig. Other animations, especially VFX such as shattering or exploding, may be pre-programmed into the software, allowing the animator to save time (Boudon, 2013).


The principles of animation, including follow through and overlapping animation, are still essential for creating realistic movements (as seen in Disney’s Tangled). Sourced from: https://www.tumblr.com/search/tangled+gifs
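The pose-based timeline described above amounts to storing values at keyframes and interpolating between them. This Python sketch interpolates a single rotation channel with made-up key values:

```python
# Sketch of keyframe animation: poses are stored at key times and the
# in-between frames are interpolated. One rotation channel is shown,
# with illustrative frame numbers and degree values.

keys = {0: 0.0, 24: 90.0, 48: 0.0}  # frame -> rotation in degrees

def pose_at(frame):
    """Linearly interpolate between the surrounding keyframes."""
    times = sorted(keys)
    if frame <= times[0]:
        return keys[times[0]]
    for t0, t1 in zip(times, times[1:]):
        if frame <= t1:
            blend = (frame - t0) / (t1 - t0)
            return keys[t0] + blend * (keys[t1] - keys[t0])
    return keys[times[-1]]

print(pose_at(12))  # 45.0  (halfway between frames 0 and 24)
print(pose_at(36))  # 45.0  (halfway back down)
```

Real animation software offers richer interpolation curves (ease-in, ease-out), which is where principles like follow through are expressed.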


7.   Scene Assembly

Similar to blocking or staging in a theatrical sense, scene assembly is the process of positioning the assets in the 3D space and manipulating the camera through the creation of camera paths (CG Architect, 2013). This allows the artist to recreate the storyboards, movement or scene outlined in the initial planning stages (Dreamworks, 2013). The assets may come from multiple different files and vary in size, some being extremely large, like a 3D forest environment, which will cause many computers to struggle (CG Architect, 2013). Scene assembly must therefore be carefully planned, and many artists will use secondary programs to help with this stage.


Rough scene assembly. Sourced from: http://www.dreamworksanimation.com/insidedwa/productionprocess


8.   Lighting

As the name suggests, lighting is the process of incorporating a virtual light source into the 3D scene (Boudon, 2013). This stage needs to be done carefully in order to create the desired effect (Chang). Lighting gives the 3D objects shading and shadows, allowing the scene to appear more dynamic and realistic. However, different assets will need to be lit differently and/or have their settings adjusted: a light source, for example, will shade a metal surface differently to a wooden surface, and this must be taken into account (Chang).


Observe the different highlights on the scales, armour, cloth and hair. Sourced from: http://www.moustachemagazine.com/2014/06/at-the-movies-44/how-to-train-your-dragon-2-international-poster-slice/


9.   Rendering

One of the final steps, rendering involves taking the 3D scene, with its camera placements and movement, lighting and effects, and outputting it as usable files (Boudon, 2013). Different rendering settings will give a different final product; it is therefore important that ample time is left for experimentation in order to achieve the desired effect for the scene (Boudon, 2013; Chang).


Front: the rendered image including camera effects and lighting. Behind: the 3D scene. Sourced from: http://area.autodesk.com/3dsmax2011/features


10. Compositing

In this stage, the rendered file (an image or animation) is brought into a compositing program (Chang). This stage includes everything from special effects, to final touch-ups, to combining and assembling multiple visuals, possibly from different renders or sources (Chang). For example, a rendered character may be placed into a live action scene. Specialised visual effects artists play a large role in this stage.


The rendered animation (left) has special effects added to it (right) in a compositing program. Sourced from: http://www.motiondesignandcompositing.blogspot.com.au/
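Placing a rendered element over another image typically uses the "over" operation, blending by the foreground's alpha channel. A single-pixel Python sketch with illustrative values:

```python
# Sketch of the "over" operation used in compositing: a rendered element
# with an alpha channel is laid over a background plate. Single-pixel
# values here are illustrative.

def over(fg_rgb, fg_alpha, bg_rgb):
    """Blend a foreground pixel over a background pixel."""
    return tuple(round(f * fg_alpha + b * (1 - fg_alpha), 3)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent white effect over a dark background:
print(over((1.0, 1.0, 1.0), 0.5, (0.2, 0.2, 0.2)))  # (0.6, 0.6, 0.6)
```

Applying this per pixel is what lets a rendered character sit convincingly inside a live action plate.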


11. Video Editing

Lastly, video editing is the final, but still important, stage in the 3D production pipeline. The composited footage undergoes final editing: this includes the addition of audio, sound effects and, possibly, adjustments to camera and framing (Chang). Due to the addition of audio, this stage is vital and quite extensive: the process of the final mix must not be passed over (Dreamworks, 2013).


The process of the final mix for the movie Puss in Boots. Sourced from: http://www.dreamworksanimation.com/insidedwa/productionprocess


Boudon, G. (2013). Understanding a 3D Production Pipeline – Learning the Basics. Retrieved from

CG Architect. (2013). Scene Assembly. Retrieved from

Chang, A. The Process of 3D Animation. Retrieved from

Dreamworks Animation. (2013). Production Process. Retrieved from

Guerrilla CG Project. (2009, June 5). The Basics of UV Mapping [Video file]. Retrieved from

Marshall, P. (2010). The 5 Stages of Shooting a Film Scene. Retrieved from

Slick, J. (2014). 3D Modelling. Retrieved from

Slick, J. (2014). What is Rigging?. Retrieved from