A software framework provides a level of abstraction for writing application code relevant to a particular environment. Although this may sound like a mouthful, it's quite simple once you start to unpack it.
When we write code with JavaScript we rely on its environment to provide a context for our software and to turn our source code into something usable. However, as applications become more complex, what is considered usable starts to require more development in order to match user expectations. Working with standard JavaScript, or what developers often refer to as Vanilla JavaScript, can result in developing repetitive routines just to meet basic user expectations (particularly when starting a project from scratch).
As a result, software extensions often build on this environment by adding some level of abstraction that addresses this repetitiveness. This essentially allows developers to start building with the assumption that some lower-level requirements have already been fulfilled by the extension.
Frameworks are software extensions that we add to our application, generally to simplify coding outcomes that would otherwise be cumbersome, or even impossible, to achieve without the extension.
In terms of JavaScript, this environment is generally a web or mobile application, where the framework provides an interface that makes commonly used functionality easier to access, implement and maintain.
A JavaScript framework such as Vue provides an interface for simplifying specific repetitive tasks when developing a JavaScript Single Page Application (SPA). Frameworks such as Vue will often have a set of rules or procedures that must be followed, in addition to what qualifies as valid JavaScript, in order to apply the benefits of the framework effectively within your application.
What is Vue?
Vue is an open-source JavaScript framework, developed by Evan You and first released in 2014. Its primary use is for building web interfaces and single-page, front-end web applications. The main source code repository for Vue is maintained on GitHub (under the vuejs organization).
Vue is licensed under the permissive MIT license, which carries very few restrictions and a reasonable degree of licensing compatibility. You are therefore free to download, modify and redistribute the codebase, and to use it in both commercial and non-commercial projects, including alongside code under other copyright and copyleft licenses.
You do not need to download and include the Vue source code in your program in order to use Vue. The Vue GitHub repository is an uncompiled version of the Vue source code that exists primarily for the purpose of maintaining Vue itself. We will discuss how to include and utilize Vue in your projects in more detail a little later.
Supported Paradigms and Architectural Patterns
Among the solutions Vue provides for application development is a multi-paradigmatic approach that builds (though not strictly) on the Model-View-ViewModel (MVVM) architectural pattern and combines familiar concepts from Object-Oriented Programming (OOP), such as an emphasis on data encapsulation.
Vue is often described as fulfilling the purpose of the "View" within the Model View Controller (MVC) pattern. Although this may be true, it does not fully describe one of Vue's most powerful features, known as its reactivity system, which, when utilized, tends to make Vue better described as following an MVVM architectural pattern.
Primarily, MVC and MVVM are intended to decouple an application's presentation layer from its functionality layer. In other words, we can use Vue to separate the design of our application, at its most basic, into two concerns:
what the user sees and interacts with, and
what happens in the background after a user interacts with the application's interface (or presentation layer).
The difference between MVC and MVVM becomes apparent in the emphasis placed on this separation in the former, and in how the separation is maintained. As such, a developer utilizing the MVC approach will explicitly define how the presentation (View) and functionality (Model) layers interface with, and subsequently update, each other.
The Vue approach differs in that there is an inherent link between the Model and the View: updating the Model updates the View, and updating the View updates the Model. This is largely attributable to what is known as Vue's reactivity system. Although the reactivity system will maintain the relationship between the Model and View and update both in real time accordingly, it is worth noting that some aspects of presentation will still need to be updated by the application's codebase, and not inherently through the reactivity system.
When this is applicable largely depends on matching user expectations and can, for example, be as simple as explicitly adding the code that removes text from an HTML text input field after a user has typed in a value and hit the Enter key. In other words, the data entered into the text input field can be captured within Vue's functionality and work as the developer expects, yet if the same data persists within the User Interface, this might not match user expectations. Of course, more complex examples of decoupling and reconciling data between the Model and View exist, and would require an in-depth look at User Experience (UX) and User Interfaces (UI).
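To make this concrete, here is a minimal sketch using Vue 2's global API (the element ID, property and method names are illustrative, not from the original post). The v-model directive keeps the Model and View in sync, while clearing the field after Enter is handled explicitly in code:

<div id="app">
  <input v-model="message" @keyup.enter="submit">
</div>

<script src="https://cdn.jsdelivr.net/npm/vue@2/dist/vue.js"></script>
<script>
new Vue({
  el: '#app',
  data: { message: '' },          // the Model; v-model binds it to the View
  methods: {
    submit() {
      console.log(this.message);  // the value is already captured in the Model
      this.message = '';          // explicitly clear the field to match user expectations
    }
  }
});
</script>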
Vue is often described as combining the best of the popular AngularJS framework (maintained by Google) and the React library (maintained by Facebook). It is lightweight at approximately 33KB and is considered by many to be easier to learn than its counterparts. It is also a progressive framework, meaning that it can be plugged into existing projects as well as combined with other third-party JavaScript tools.
In this post, we will dive into the details around setting up HTTPS for your website using a completely free set of tools and also discuss some basic concepts around what it is and why it’s so widely utilized.
What are HTTPS and SSL/TLS, and why would I need them?
While browsing various sites on the internet you may have noticed that your browser throws a warning on occasion about a site not having a secure connection. You might also think twice about continuing to load the site when the warning prompts you to consider that any information you submit on the site (for example through a web form or other means) could be intercepted by an unintended recipient.
Hypertext Transfer Protocol Secure (HTTPS) provides computers or nodes on a network (or many interconnected networks, like the Internet) with a set of rules they can agree on for securely transporting data between requesting and receiving nodes. This is particularly significant in contrast to HTTP, which does not require data in transit to be encrypted.
When a connection is established between receiving and requesting nodes on such a network, only those nodes should have the key to decrypt that data in transit. This essentially means that even if the data you submitted via a webform on an HTTPS site is intercepted by an unintended recipient it will remain encrypted.
Transport Layer Security (TLS) is a common type of encryption used with HTTPS connections. It is the successor to Secure Sockets Layer (SSL) and most often provides the Secure component of the HTTPS protocol.
What do I need to get started?
In order to secure your site's connection with HTTPS you will need an SSL/TLS certificate. These certificates can only be issued by a Certificate Authority (CA) and, depending on your needs, CA services range from simple to complex, with pricing that scales accordingly.
Let’s Encrypt is a nonprofit, global CA that issues SSL/TLS certificates at no cost in the interests of a more secure and safe Internet.
To get started you will need:
A hosting plan that allows SSH access. This can usually be accomplished with a standard cPanel installation which many hosting providers supply even if you are using a basic shared hosting package.
As previously noted, we will be using Let’s Encrypt as a CA in order to obtain a recognised SSL/TLS certificate that will be valid for 90 days.
We will be using GetSSL to request the certificate and validate ownership of your domain.
As SSL/TLS certificates will expire we will finally set up a Cron job to auto-renew the certificate.
Setting up remote server access with Secure Shell Protocol (SSH)
Although many hosting providers do offer Terminal access as part of a standard cPanel installation, some do not.
As such, we will look into setting up SSH on your local Linux desktop which will allow you to remotely access the server where your site is being hosted.
In order to set up SSH you will first need to generate private and public keys that, when successfully paired, will create a secure remote connection to your site's server.
Under the Security section of cPanel open the SSH Access app and click the Generate a New Key button.
After you have added a password for the keys you will receive output similar to the one below. If you chose to enter a passphrase when you created the pair, take note of it, as it will be required to connect remotely to your site's server via SSH.
Generating public/private rsa key pair.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/username/.ssh/testkey.
Your public key has been saved in /home/username/.ssh/testkey.pub.
Unless you are managing multiple pairs, you do not need to enter a name for the keys, as they will be assigned the default name id_rsa. From this output, it's important to note where the identification key has been saved. This is your private key and it should not be shared; in conjunction with the password to your server or cPanel account it can be used to compromise access to your site's server.
You will then need to Authorise the use of the public key by accessing its Management tool.
And if your hosting package requires it, further Enable SSH access.
You will then need to download a copy of your identification key (private key). This can be accomplished with an SFTP client (such as FileZilla), or you could simply use the cPanel File Manager.
Back on your local desktop, place a copy of the private key in the equivalent location:
/home/username/.ssh/
You will then need to change permissions for the key such that only you are able to read/write it.
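For example, assuming the key was saved with the default name id_rsa:

chmod 600 ~/.ssh/id_rsa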
You can now open a Terminal to connect to your server.
If you are using a shared hosting package you will first need to find out the address of the server that is hosting your site as well as the relevant port number.
In cPanel you can find out the server name, under Server Information in the General Information section. However, this does not necessarily equate to your server’s full address and port number.
Alternatively, you could also use your server’s IP address to establish an SSH connection. With your username, IP address and port number the command would resemble the following format,
ssh username@162.123.0.1 -p12345
Where ssh is the command being invoked, username is the username used to log into your site's cPanel account, @ is required to precede the address, 162.123.0.1 is an example of what an IP address could look like, and -p precedes the number of the port associated with your site.
If you are unable to locate the information required under the Manage SSH cPanel app or from your cPanel Server information section, then you may need to contact your hosting provider for further assistance.
Once you have the required information, open a Terminal instance and execute the noted command. If you have SSH installed, it will then try to match the private key on your local desktop with the public key on the server. If there is a match you will be prompted for the passphrase you entered when creating the private and public keys (from above).
You will then need to enter your password. This is the same password, associated with the username from the above command, that you use to log into your site's server (or cPanel).
If you have successfully logged in, you will see that the command prompt has changed to username@servername.
Installing and configuring GetSSL
We are now ready to install and configure GetSSL in order to request and install an SSL/TLS certificate from the Let’s Encrypt global CA.
Run the following commands via SSH to obtain a copy of the GetSSL installation script and set its permissions,
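Per the GetSSL project's README, fetching the script with cURL and making it executable typically looks like this (check the repository for the current recommended invocation):

curl --silent https://raw.githubusercontent.com/srvrco/getssl/latest/getssl --output getssl
chmod 700 getssl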
Use the following command with the -c (create) flag to create the default config files; thereafter you will be able to find the applicable files under ~/.getssl/
./getssl -c yourdomain.com
Where yourdomain.com is the domain you would like to install HTTPS for. GetSSL will create the relevant files and folders under ~/.getssl/yourdomain.com/.
You will then need to cd into the following directory in order to edit the config file.
It’s also worth noting that there is another config file with the same name that exists at a higher level in the directory structure. If you are setting configs for multiple domains you could use this config file as such, however throughout this guide we are only focusing on the config file within the subfolder /yourdomain.com. All configs, as such, will only apply to this domain.
cd ~/.getssl/yourdomain.com
You will then need to open the getssl.cfg config file within a text editor to modify it. If you have access to vi on Linux you could open it with the following command,
vi ./getssl.cfg
Once in vi you can use h, j, k and l to navigate, i to switch to insert mode, Esc to return to normal mode, and :wq to save and quit.
Navigate to the line where you see the following variable commented out
#ACL
Uncomment the line by removing the # character and replace the location assigned to ACL with the path where your site's HTML files are located, for example,
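The path below is hypothetical, assuming your site is served from /home/username/public_html; note that GetSSL expects the ACL entry to point at the acme-challenge directory itself, which you will create in the next step:

ACL=('/home/username/public_html/.well-known/acme-challenge')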
You will then need to create the directories /.well-known/acme-challenge/ under the above-noted location on your server. This is necessary in order to prove that you have ownership of the resource.
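Continuing with the same hypothetical path:

mkdir -p /home/username/public_html/.well-known/acme-challenge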
If you would like both yourdomain.com and www.yourdomain.com secured, navigate to the line where the SANS variable is defined, uncomment it and assign it as such,
SANS="www.yourdomain.com"
There is no need to include your primary domain here. Save and exit the config file.
Back in your GetSSL directory via SSH type,
./getssl yourdomain.com
This will verify your ownership of the domain, request the certificates, then save the certificates and private key. If all goes well you will see output similar to the following,
Verification completed, obtaining certificate.
Certificate saved in /home/username/.getssl/yourdomain.com/yourdomain.com.crt
Your domain is now successfully staged and ready for the production certificates to be installed.
You can verify that this has been successful by looking in the location noted in the previous output, where you will see the .crt files that were requested during staging.
You will then need to delete these .crt files before proceeding with certificate installation. This is because we need to change the config file and then re-run the getssl script; as the certificates already exist and were only just requested, attempting to run the script without deleting them will result in the script halting, given that the certificates are nowhere near their expiry date yet.
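For example, adjusting the path to match your domain:

rm ~/.getssl/yourdomain.com/*.crt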
To edit the config file reopen the file that was previously edited,
vi ~/.getssl/yourdomain.com/getssl.cfg
Scroll down to the section where the variable CA is initialised and comment out the staging value, then uncomment the assignment to the production location. NB: the # (hash character) is used to denote comments.
# The staging server is best for testing
#CA="https://acme-staging-v02.api.letsencrypt.org"
# This server issues full certificates
CA="https://acme-v02.api.letsencrypt.org"
Save and exit the file as per normal with vi, by typing :wq <Enter>
You can now run the getSSL script again to request the production-ready certificates.
./getssl yourdomain.com
You should receive output similar to the first time you ran the script. However, this time you will keep the certificates returned in order to finalise the installation.
Installing the SSL/TLS Certificate
Back in cPanel go to the SSL/TLS app in the Security section.
From within this app, you should have access to Manage SSL for your site
From here you should have access to the Install SSL Website interface
Select the appropriate domain from the Domain drop-down list, then open, copy and paste the contents of the files generated by getSSL into the corresponding fields,
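Assuming the default GetSSL output locations (verify the exact filenames under ~/.getssl/yourdomain.com/ on your server), the mapping is typically:
yourdomain.com.crt into the Certificate (CRT) field
yourdomain.com.key into the Private Key (KEY) field
chain.crt into the Certificate Authority Bundle (CABUNDLE) field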
Click on the Install Certificate button, and if all goes well you should now be able to navigate to your site via HTTPS.
Auto-renewing your certificate
GetSSL has many useful features including the ability to utilise a Cron job to auto-renew certificates that are about to expire. In cPanel under the Advanced section, click the Cron Jobs app to set up a script that is executed according to a schedule.
As noted, Cron jobs run according to a schedule, and GetSSL returns certificates that expire after 90 days, so you should not need to request a renewed certificate before that timeframe. You can set up your Cron job to run based on your requirements; however, from what I have experienced, certificates will not renew if they are less than a month old.
The command you could use for your Cron job would require the location of the getSSL script that was fetched via cURL, with the filename and the following flags,
/home/username/getssl -u -a -q
The -u flag updates getSSL if a newer version exists, the -a flag auto-renews any certificates that are about to expire, and the -q flag will only email you if any errors are returned during the job.
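A crontab entry using this command might look like the following; the schedule itself is arbitrary, as a daily run is sufficient given that getSSL only renews certificates approaching expiry:

23 5 * * * /home/username/getssl -u -a -q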
Now click the Add New Cron Job button and you should be good to go!
Let's dive into the details of the Minotaur character that has recently been published by RABBITMACHT.
The Minotaur consists of two main 3D components: the Minotaur itself and the Minotaur's Armour. As the components are modelled separately they are not interdependent. In other words, you can use the Minotaur with or without his armour, and/or extract and use the armour on a completely separate character.
Although these two components also consist of other objects or sub-components, what distinguishes these main components is that each sub-component inhabits the same UV layout within its main component.
Let’s have a look at the main components in a little more detail.
Main Component 1 : The Minotaur Model
The Minotaur Model can further be broken down into several other components, including:
upper teeth and lower teeth
tongue
left and right eyes
As these components typically do not deform during animation (they are only transformed), they can safely be parented to a bone for the purposes of animation.
This is, of course, with the exception of the tongue. As the Minotaur uses Blender's Rigify system, we are fortunately provided with deformation bones and controllers for the tongue, too.
Main Component 2 : The Minotaur’s Armour
The Armour is a more complex main component, as it is made up of several objects, including:
Straps
Shoulder Guards
Loincloth
How these components contribute to the rig during animation takes on three different approaches.
The simplest approach can be seen with the shoulder guards, which are weighted to the Minotaur's shoulder bones and also to the first upper arm bone for a little extra deformation that assists in preventing objects from intersecting.
Avoiding Intersections
It’s worth noting at this point that in order to rig the Armour so that it deforms with the Minotaur’s movements and avoids excessive intersections, we have taken a hybrid approach that results in a combination of techniques while keeping viewport interactivity responsive with minimal dynamic simulations.
Bearing this in mind, although the straps could be animated with a cloth simulation, this would be overkill given that they are intentionally designed to look and act like a hard, leathery material. As a result, the straps are weighted to the armature and follow the Minotaur's body deformations. One of the key tools for avoiding too many intersections between the Armour straps and the Minotaur's body is Shape Keys. Shape Keys (or Morph Targets) can be used to tweak the armour and main body when intersections become visible.
Shape Keys are particularly useful when rendering stills as they add a great deal of convenience, without compromising on the outcome or the scene’s believability.
Finally, the Minotaur’s loincloth is animated by means of dynamic simulation. This adds to the scene’s realism while keeping simulations at a minimal. The Minotaur’s body is set as a collision object and as both objects have a polycount that can be rendered in realtime the simulation can be computed within a very reasonable timeframe.
As a result it’s advisable that you Bake a dynamic simulation to disk before rendering an animation
What does the Minotaur product consist of?
The Minotaur product comes with two main production-ready blend files, a supplementary FBX file with a baked walk cycle and a water-tight, stylized STL file for 3D printing.
Minotaur_for_animation
Minotaur_for_stills
FBX baked walk cycle
STL stylized model for 3D printing
Each file is optimized for its specified outcome, and the production-ready blend files all have high-res 4K textures packed into the .blend file.
Production-Ready Files
The Minotaur for Animation file uses Blender’s Eevee renderer with the Principled BSDF shader. This provides the speed and level of detail required to render animation sequences with reasonable overhead and quality.
The Minotaur for stills file uses a more complex approach to rendering by leveraging Blender's Cycles renderer. Both the main Minotaur and Armour meshes have highly customizable shading networks that include Sub-surface Scattering (SSS), Fresnel, Ambient Occlusion and many high-res masks for targeting specific parts of each model. For instance, if you wanted to change the Minotaur's veins from green to red and make them glow, there are already masks in place to help you achieve that.
Shading Network
Although at first glance the Shading Network might seem complicated, it is in fact very logical and follows a simple design principle:
Create a single shader (for example a Diffuse) that applies to the entire model.
Then minimize the shader’s coverage with a mask and mix in the previous shader that followed the same methodology, through a Mix shader.
Effectively, what this means is that it is really easy to take each Shader’s output and plug it directly into the Material Output’s Surface input to see exactly how each shader affects the model.
If you would like to learn more about how the textures, materials and UV’s are built for the Minotaur, the following post can help you gain deeper insight.
Rigged with Rigify for Animation
Although I have often noted the benefits of a translation-biased rig to many of my students, in this instance, for the sake of simplicity and to curb the learning curve, the Minotaur uses the very popular Rigify for animation. As you may already be aware, Rigify automates the process of creating a Forward Kinematics (FK) rig, which is used for deformation, as well as a controller rig that uses various transforms and constraints for posing the FK rig.
Animating with Rigify is intended to be relatively straightforward, and once your animation rig has been generated no special plugins are required thereafter.
The Rigify Metarig has also been included in the production files, giving you the ability to regenerate the rig if you wish to do so.
The Minotaur comes equipped with a 40-frame loopable walk cycle. This provides an example of how you would set up multiple animations for your character, particularly if you wanted to export it to a game engine.
If you want to learn more about the significance of a Forward Kinematics rig, the following post can help you get a better understanding. Although controller rigs are essential for making character animation manageable and will generally consist of various hierarchical chains with constraints such as Inverse Kinematics, they will still typically rely on an underlying FK rig to actually deform your character's geometry. Read more below.
Using the NLA Editor
Blender’s Non-linear Animation Editor, not to be confused with an NLE (Non-linear Editor) which is typically used for video editing and is also another tool available within Blender, provides a high level of abstraction relating to animation data. In much the same way that you can rearrange video clips in an NLE such as Adobe Premiere Software, Blender’s NLA Editor allows you to convert animation sequences into clips (also called Actions) and rearrange them in any desired order.
With the rig selected open Blender’s Non-Linear Animation (NLA) Editor. This allows you to push down the current animation to an Action. When exporting animations for a game character you would typically create multiple Actions such as walk, run, jump etc. By using actions the game engine is able to distinguish one sequence from another in a non-linear way. In other words, if you wanted your character to jump the game engine would go straight to the jump action as opposed to first playing the walk animation then the run animation and finally reaching the jump animation.
Once an Action has been created you can still edit it by simply selecting the action in the NLA Editor and hitting Tab on the keyboard. The action's keyframes will then be exposed in the Timeline, Graph Editor and Dope Sheet. To create another Action, return to the NLA Editor, select the action you are editing and hit Tab to exit Edit mode, then simply keyframe another animation. To ensure that the old action does not override the visibility of the new action, uncheck its NLA track; this effectively mutes the action.
Also included within the Minotaur product is an FBX file demonstrating what a baked and exported animation might look like. You can simply import this FBX into another 3D application, or separate it into two main parts: one consisting of the rig with the Minotaur and the other consisting of the rig with the Armour. There are many different ways of exporting animations from Blender into game engines, including UE to Rigify and UEfy to name a few. However, your use case might require something different. If you have any questions, comment below and someone will be sure to help you out.
Working with Proxies and the Shrinkwrap modifier
The Minotaur comes with an active Modifier stack. This means that in order to get the most out of the file, it's recommended that it is edited in Blender. The modifier stack actively contributes to the file's output, it is editable, and some components are hierarchically immutable. Understanding the Minotaur's proxy model setup will assist greatly in reconfiguring the modifier stack.
Find out more about the Minotaur’s modifier stack and how it optimizes the rendering process, both in the viewport and for high quality renders, by means of proxies in the following posts.
High Resolution Sculpt Data
The Minotaur for stills file comes with an ultra-high-resolution sculpted version of the Minotaur and his Armour. You can find the high-res versions within a scene collection suffixed with HR. These models are made of many hundreds of thousands of polygons, so caution should be exercised when trying to render them.
Their primary purpose is for the displacement of the realtime model’s subdivisions through the Multires modifier at render-time.
In the following post, you can find out more about how the high-res models are constructed through non-destructive as well as Dyntopo sculpting.
A Highly Efficient UV Layout
Whether you are creating characters that will be hand painted or texture mapped by compositing photos, an efficient UV layout is crucial to avoid exposing seams. A UV layout that is effective will be applicable not only in your 3D application but in 2D editing as well; as such, UV's should be laid out matching the form of a character as closely as possible while still avoiding stretching.
You can find out more about generating an efficient UV layout in the following post.
Where can I get the Minotaur?
The Minotaur is available from the RABBITMACHT store for direct purchase. This comes with a great deal of product support and all minor updates are free.
The Minotaur comes with a standard Royalty-free license, which gives you as much versatility as you could possibly need to use it in your own projects, whether commercial, non-commercial, educational or other.
When developing 3D characters it’s important to retain as much of a non-destructive workflow as possible.
This is particularly important with regards to games development as the models used in the final output will often have certain elements baked into texture maps.
Without a non-destructive workflow, modifying a baked texture map or rebuilding components of your character might be your only options when changes to your final output become necessary. Achieving your desired results this way can be particularly difficult, as it requires a very indirect approach to editing, thereby compromising control and detail while taking up more time than necessary.
A non-destructive workflow means retaining as much of the character's creation history as possible, so that reverting to different stages in your character's creation process becomes a lot more accessible. You can then make changes where necessary (be it at the modelling, texturing, sculpting, rigging or other stages) and automatically allow those changes to propagate throughout your character's production pipeline.
Although setting up characters in this way might initially require a bit more planning, once you get the hang of it and start seeing how much time it can save you in the long run, the technique will soon become a necessity in your character development toolkit.
Setting up a reference
Starting with references is typically advisable. They could be 2D images or, as in this case, a 3D model of a wraith-like character.
Cleaning the reference
This reference model will not be used in the final output; it simply forms the starting point of the modelling process. The reference model also consists of a great deal of non-manifold geometry, which would make attaching it to an armature unpredictable.
As the character we are building is designed to be more streamlined and agile we will not need a lot of the accessories attached to this model.
With the reference model selected simply,
Go into Edit mode,
Select a single vertex from the model’s component you want to delete.
Hovering over the 3D Viewport, hit L on the keyboard
and Blender will select all connected vertices.
This makes deleting whole portions of the model much easier.
MakeHuman is an open-source application for generating human-like 3D characters.
Primarily, we use MakeHuman because it gives us a base mesh with edge loops that deform really well during animation, and an efficiently laid out UV map that maximizes the 0-to-1 texture space.
The MakeHuman model is then exported as an OBJ file and imported into Blender. The OBJ file format has the added bonus of retaining a model’s UV layout.
Line the MakeHuman model up with the reference model. When modifying the models, avoid transforming them at the object level. In other words, when scaling the model to match your game engine's units, do so in Edit mode: select and scale the model's vertices towards those of the reference model. This ensures that translation and rotation for the game model remain at 0 while scale remains at 1 on all axes, resulting in more predictable in-game behaviour for your assets.
When creating content for the Unreal Engine you should ensure that your scene units are set to Metric with a Unit Scale of 0.01, since Unreal works in centimetres.
It’s worth pointing out that currently, we are setting up the reference model. The previous image depicts the reference model with its original arms removed and replaced with the make human models lower arms and hands. The original model has also had half of it deleted then mirrored on the remaining half.
This symmetrical modelling technique is not advised for the model that will be exported to UE4, as it will result in overlapping UV's. However, since we are going to discard the reference once we have applied its vertex positions to our export model, use of the Mirror modifier is perfectly acceptable in this case and will have no side effects on this character's pipeline.
Modelling The Base Mesh
Once the reference is completed it's time to shift our focus to the main model (Base Mesh). We'll start with modelling, as this is the only stage in the character's pipeline with destructive properties.
The Base Mesh will be set up for the non-destructive workflow and ultimately exported into our game at various Levels Of Detail (LOD’s).
Again, we start by importing the same MakeHuman OBJ model and lining up to our reference that we recently modified.
Although it will be helpful to line up the two models as closely as possible, in some cases this won't be necessary, as certain components from the reference will not be required for the final output.
You can also save yourself some time by not lining up your Base Mesh with the reference where parts of the Base Mesh need to be re-modelled, for example the eyes and eye sockets, as well as the toes.
Regardless of whether your character is being exported for a game or not, working with efficient geometry that is less taxing on system resources is generally advisable.
As a result, select the components of the Base Mesh that are not required such as the Eyes, Eye sockets, Ears, Toes and any other unnecessary geometry and delete their faces.
It’s worth noting that this operation is destructive, in that we are permanently modifying the Base Mesh.
The main side effect of this step is that it will typically result in non-manifold geometry on the Base Mesh. Correcting this is far simpler and quicker than modelling or retopologizing the Base Mesh from scratch, particularly with the use of Blender's Grid Fill operator and the UV Stitch tool.
Your completed Base Mesh, at this stage, should not contain non-manifold geometry and should have as few UV islands as possible. By retaining and leveraging the MakeHuman model's topology and existing UV layout you will save yourself a great deal of time and effort.
Blender’s Shrinkwrap modifier is used to match the vertex positions of the selected object to the shape of the targeted object, by displacement.
The target in this case is the reference model and the selected object would be the Base Mesh.
Much of the non-destructive nature of this workflow relies on the Shrinkwrap modifier and as you will see this modifier will be utilized at different stages throughout character development.
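As a minimal Python sketch of this setup (the object names are hypothetical; in practice you would simply add the modifier from the Modifier Properties panel):

import bpy

base = bpy.data.objects["BaseMesh"]        # hypothetical name for the export model
reference = bpy.data.objects["Reference"]  # hypothetical name for the reference model

mod = base.modifiers.new(name="Shrinkwrap", type='SHRINKWRAP')
mod.target = reference                     # displace the Base Mesh onto the reference
mod.wrap_method = 'NEAREST_SURFACEPOINT'   # snap each vertex to the nearest surface point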
A Rig is not a tool that is exclusively reserved for animation
In fact, it is often necessary and efficient to use a rig for modelling. For example, when trying to match your Base Mesh to your reference model, a common problem is that they may be in different rest poses: one might be in a T-pose while the other is in an A-pose.
In this case, trying to transform vertices manually while avoiding the use of the Mirror modifier (for the same reasons noted above) would be an ineffective and inaccurate solution.
Setting up a quick Forward Kinematics (FK) rig for your output model then matching its pose to your reference model’s pose would be far more efficient and accurate. As this technique is non-destructive it would also be possible to set up and use your final rig at this stage too.
Destructive and Non-destructive Sculpting
Sculpting is an important part of the character development process, as it provides a stage at which to add necessary details to your model and bring your artwork to life.
It’s important not to use a destructive sculpting technique on the Base Mesh so as to retain the UV’s and vertex groups that would have resulted from UV projecting and rigging (in previous steps).
At first, we are not utilizing Dyntopo but, simply displacing the existing vertices of the Base Mesh to be slightly above the surface of the reference model.
It’s not ideal to make large scale adjustments to the model’s form at this stage. This would both differentiate from the reference and could cause unpredictable results with the Shrinkwrap modifier. As a result, you may want to avoid the use of tools such as the Grab brush.
Once you are satisfied with the general look of your Base Mesh and how it targets your reference model, duplicate the Base Mesh with Shift-d in Object Mode.
This image depicts the duplicated Base Mesh slightly offset from the original Base Mesh, however, this is just for demonstration purposes. As previously noted it’s advisable not to translate any of the models at an object level for the purposes of this setup and exporting.
To ensure that your objects remain in the same position, you could lock the object’s transform properties within the 3D viewport to be certain that you don’t inadvertently move the duplicate or the Base Mesh.
Hide the Base Mesh, select the duplicate and enter Sculpt mode.
Once in sculpt mode you can now enable Dyntopo. This will give you the ability to sculpt with the resolution and precision matching the requirements of your Normal map.
The only limitation when sculpting at this stage again relates to avoiding large-scale adjustments to the form. This should not be your focus anyway: you are essentially working towards creating a Normal map, and large-scale adjustments to your character's form should not be necessary at this point.
Baking A Normal Map
In order to keep your character’s poly count to a reasonable amount, a Normal map will be required to recreate the details, applied through Dyntopo sculpting, within the game engine.
To create a Normal map for your character, exit Sculpt mode and select the Base Mesh. Add a Multires modifier followed by a Shrinkwrap modifier to the selection, this time targeting the duplicate sculpt model with the Shrinkwrap. The order of the modifiers is important.
Subdivide with the Multires modifier as many times as is required to recreate the sculpted details.
Apply the Shrinkwrap modifier at the highest level of subdivision to ensure that you are able to bake Normals correctly.
When it comes to exporting the character for your game this setup will also give you the ability to export multiple Level Of Detail (LOD) meshes.
Once you are satisfied that you have enough subdivisions within the Multires modifier, set your character’s Display Subdivisions to 0.
Switch your Renderer to Cycles.
In the Render Properties Panel, under Bake choose the Bake from Multires option and set the Bake Type to Normals. You can now bake your character’s normal map.
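Expressed as a short Python sketch (assuming the Base Mesh is the active object and an image has been assigned to receive the bake), these are the same options found in the Render Properties panel:

import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'          # Bake from Multires is available under Cycles
scene.render.use_bake_multires = True   # enable the Bake from Multires option
scene.render.bake_type = 'NORMALS'      # set the Bake Type to Normals

bpy.ops.object.bake_image()             # bake to the active object's assigned image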
Retain all the components that went into this setup and you will have a non-destructive approach to sculpting and generating a Normal map for your character at any stage in its production pipeline.
Painting a Color Map
It is often desirable to work simultaneously on a Color map and Normal map for many different types of game and other character types. As one map will influence the other, retaining a non-destructive workflow can provide the most effective solution to this challenge.
Switch to the Eevee renderer and, with your Base Mesh selected, set up a Shading Network that utilizes the Normal map you just created. Now, when you switch to Texture Paint mode, you will be able to see how the Normal map affects your model while you paint on it.
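A rough Python sketch of such a network follows (the object, material and image names are hypothetical; the same nodes can be created by hand in the Shader Editor):

import bpy

mat = bpy.data.objects["BaseMesh"].active_material  # hypothetical object and material
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images["normal_bake"]          # the image the Normals were baked to
tex.image.colorspace_settings.name = 'Non-Color'    # normal data is not colour data

normal_map = nodes.new('ShaderNodeNormalMap')       # converts the texture to shading normals
bsdf = nodes["Principled BSDF"]

links.new(tex.outputs['Color'], normal_map.inputs['Color'])
links.new(normal_map.outputs['Normal'], bsdf.inputs['Normal'])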
If you want to make adjustments to your Normal map, simply repeat the process:
Hide the Base Mesh
Select the duplicate
Enter Sculpt mode
Sculpt the duplicate
Add a Shrinkwrap to the Base Mesh
Apply the Shrinkwrap at the highest resolution
Use Cycles to bake a normal map
Paint some more.
Exporting to Unreal
While working on your character, it's advisable not to wait until it's complete before testing it within the game engine. By using Epic's SendToUnreal plugin, not only can you easily add your character to your game, but you can also adjust and modify the character within Blender and see the results instantaneously in the engine.
One of the great features of a non-destructive workflow is that when exporting the models to a game engine you are not committing to anything that will break your workflow. In other words, we can sculpt, paint, animate and export our model, check it out in the game engine, then revert to Blender to tweak and update it.
This becomes particularly relevant when we use Unreal Engine in conjunction with the SendToUnreal plugin for Blender.
Through this plugin we simply set up our Base Mesh and rig according to the plugin's basic collection requirements, then export the character.
The character is then readily available within the game environment, with the added benefit that we can modify the character as above and see the changes update instantaneously in the game engine.
Diagram Illustrating Non-destructive Workflow for Character Development
Some industries have even experienced growth during the Coronavirus pandemic. In particular, if you work in the e-commerce industry you might have seen many clients experience growth last year with regard to sales of various product types such as groceries, clothing and electronics.
Now, that doesn’t mean it’s time to rethink your vocation in hospitality for web development but it certainly does mean there are some fundamental shifts that many consumers have undertaken and that the potential for residing to a dismal outlook for 2020 in terms of commerce can diminish if you are able to adapt quickly to these new behavioural shifts.
One of the most obvious shifts in commerce has been towards online shopping. Yes, I know, online shopping and e-commerce are nothing new, but certainly for many developing nations the shift towards this platform has brought with it many changes in consumer behaviour, and according to the United Nations Conference on Trade and Development these behaviours are unlikely to shift back.
With social distancing playing an important part in reducing the spread of the virus, this behavioural shift has reached many consumers who were previously apprehensive about online shopping. Many of these buyers have now converted to relying on remote shopping as a primary means of obtaining essential items as well as luxury goods, particularly once the Christmas and festive season of 2020 began and greater enforcement of social distancing measures followed in several nations. For these customers, forced to overcome their fears of online shopping, a new-found confidence in e-commerce is arising and showing no signs of diminishing, even as lockdown restrictions are eased.
Many larger retailers have already doubled down on their online sales, averting drastic potential losses, while smaller companies, which may not have had the capital or resources to make the transition quite as quickly, have incurred a great deal of loss and even succumbed to closure.
The Starter Store for Curated Merch
Some of the things I really miss doing with my family pre-pandemic are visiting local markets, rummaging through second-hand stores and finding peculiar items at a café that doubles up as a curio shop. Sadly, a lot of these sellers have faced some very difficult times, while others have closed their businesses permanently. It is with these individuals in mind that I decided to create the Starter Store for Curated Merch.
The Starter Store is an online platform for merchants with a personality that reflects in their goods.
It’s easy enough to start your own online store as there are many platforms out there for sellers, however, if this is something you have tried before and abandoned the project you would not be the first. Many sellers entering the online market space can easily become overwhelmed by the steep learning curve required to materialize the perfect store and often have to revert to expensive developers. Another common hurdle for new online shop owners to overcome is how to connect the right customers with your products, eventually leading towards expensive advertising campaigns with little or no returns.
The Starter Store for Curated Merch is not a cookie-cutter online shopping platform, nor is it a platform for taking on Amazon or other multi-vendor stores. As its name suggests, it's about starting up a store and building it up over time through a scalable and secure platform.
How the store scales is entirely up to you, as you work closely with a personal consultant that advises you and works towards harmonious implementations of technologies with a seller’s personality. You upgrade your store when you are ready to do so and you control your store’s inventory at your own pace.
And if you want to do it all yourself, you will have all the technical essentials at your fingertips and all you need to do is add the store’s content and start selling. It’s entirely up to you!
With the Starter Store, you never need to feel overwhelmed if you are not technically-minded, we take care of all the heavy lifting for you so that you can focus on what you love, getting your products in the hands of your customers.
If you don’t already have your online store up and running there’s never been a better time to get started selling online. Not only will this help with regards to your site getting indexed by search engines sooner and thereby give you a greater opportunity of being found by your customers but it’s also worth taking into consideration that a far greater number of people are currently shopping online than ever before and that this uptake is set to continue, particularly in developing nations. As an online store owner, you are not restricted by physical distances in terms of reaching customers across the globe.
One of the many great qualities of the Starter Store is that you can grow your store from just a few products to as many as you want, overnight! You also don't need to worry about hosting, domains or the technical hurdles of setting up an e-commerce site, as we have all of that covered for you. Each Starter Store comes with a package that lasts an entire year, or as long as you want.
Don’t get side-tracked with coding
The Starter Store is integrated with all the right technologies to give you the edge in acquiring new customers without you having to write a single line of code.
With Google Analytics integration you will be able to aggregate the data needed to ensure that you are providing your customers with the products they are looking for.
If you’ve never considered SEO (Search Engine Optimization) nor analyzed website traffic before, not to worry we can keep a firm eye on your traffic for you and advise on the best practices for increasing your site’s engagement.
Receiving and refunding payments is also a breeze with the industry-leading payment gateway PayPal. You also have the option to accept payments via Credit or Debit cards, already integrated into your store. No clumsy card-swiping machines! Simply get paid instantly, and you will receive an email with all the details to fulfil your customer's orders.
If you are like us and always thinking about your own and your customers' security, the Starter Store has you covered there too. Your site is delivered with the new standard in web security: HTTPS with end-to-end encryption and a digitally signed SSL certificate. We feel that, as a shop owner, you should never have to be concerned about securing your customers' data; you focus on selling and we'll take care of the rest.
Watch the video below for a brief introduction to the beautifully responsive and intuitive interface of The Starter Store and get in touch if you have any further queries or simply just get started!
Throughout the game development process you will need to work with assets that have been generated outside of Unreal. It stands to reason that if you are using Unreal Engine for your game, you will likely want to utilize its flagship 3D rendering capabilities. As such, working with a 3D content creation suite like Blender will form an integral part of realizing your game. However, the transfer of digital assets from Blender to Unreal might not be as straightforward as you had hoped. In fact, this need no longer be the case: in the past year or so, leaps and bounds have been made in getting these two software environments to communicate almost seamlessly with each other.
And FBX for all…
Although the FBX file format's name alludes to digital justice, if you have been using the format since about the mid-20-teens you'll be well aware of its idiosyncrasies, which prompted the advent of wide-ranging support within Blender for other digital asset interchange formats. As such, you might have experimented with Collada (.dae), 3D Studio (.3ds) or Wavefront Object (.obj), to name a few newer and older options, but of course all of these interchange formats have their pros and cons: where you might win with one feature, another could be compromised in that particular file format. The result is that there would typically always be some degree of editing required within the target environment in order to get the asset to match the source.
That’s not to say that the FBX format is a one-stop solution for all of these issues, however, we can be assured with more certainty that in interchanging data between Blender and Unreal that our outcomes are more predictable. A considerable amount of work has been dedicated to getting the FBX file format to work from Blender. A commendable effort, particularly given that the file format is owned by Autodesk and their documentation on the subject has, in some instances, been somewhat scarce.
In this post, we are going to utilize a Blender Add-on that is currently in development at Epic Games that automates the export/import process required for the interchange of 3D digital assets between Blender and Unreal.
As previously mentioned, you will need to first link your GitHub account to the Epic Games GitHub organization. If you are unfamiliar with this process you can read more about it in a previous post. Without making this connection between accounts you will not be able to access the GitHub repository for the Add-on.
Alternatively, if you just want to test the addon you can download version 1.4.13 here
As noted, though, this version will likely become outdated quickly, as the Add-on is under heavy, active development, and you will benefit a great deal more from using the latest version. It's therefore recommended that you follow the steps for linking your GitHub account to the Epic Games developer GitHub community before continuing.
Installing the Add-on
You will need Blender version 2.83 in order to install the Add-on. Although it may work with other versions, Blender 2.83 LTS is the version that the Add-on targets.
From Blender’s main menu click Edit > Preferences, then click on the Add-on button. There are various ways of installing Add-ons in Blender, this is, however, probably the simplest.
Click the Install Add-on button and find the zip file you downloaded to complete installing the Add-on.
Blender Add-ons are typically a collection of Python scripts, that extend Blender’s functionality. They might also sometimes be referred to as plugins (unofficially).
Configuration for Blender and Unreal
Once the Add-on is installed, click its checkbox to activate it. You will notice that a new menu item called Pipeline appears in Blender, and some new collections appear in your scene. We will discuss these options a bit later. For now, let's turn our attention to the Add-on's preferences.
From within the Preferences dialog box for Send to Unreal you will notice four sections Paths, Export, Import and Validations.
The Paths section is used to specify the location of the assets you are exporting from Blender when they are sent to Unreal.
The Export section can be used to set specific FBX settings for the assets being exported.
Use the Import section to control how the asset is imported into Unreal; for example, you can choose to automatically launch the Unreal FBX import settings dialog from here.
Validations can be used to halt the export/import process in the event that certain criteria have not been satisfied. For example, if the asset has broken links to missing textures.
Once you have configured Blender to use the Add-on, launch the Unreal Editor and open your project. The following configurations work on a per-project basis and not on the application as a whole.
From your project's Edit menu click the Plugins option.
From the Plugins dialog box, search for Python and locate the Python Editor Script Plugin. Click the Enabled field and Unreal will ask you to restart the Editor.
Go ahead and restart the Editor.
Once the Editor has restarted, the new Python plugin will be activated. In order to access certain settings for the plugin, go back to the Edit menu and click on the Project Settings option.
Search for Python Remote Execution and click on the Enable Remote Execution field in order to turn it on.
Live Export/Import
When you installed the Send To Unreal Add-on a new Pipeline menu as well as several new collections were added to your Blender scene.
The collections include Mesh, Rig, Collision and Extras. We can use these collections to determine which assets are included in the live workflow between Blender and Unreal by adding the applicable asset to the appropriate collection. For example, we are not interested in the additional widgets and rig within this blend file and only wish to export a static mesh of the rhino.
With the rhino mesh selected, hit M on the keyboard to move the mesh to a specific collection. As you might expect, the Mesh collection is the one to choose for adding static meshes to the live workflow.
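If you prefer to script the move, the equivalent in Python would be something like the following (the object name is hypothetical):

import bpy

obj = bpy.data.objects["Rhino"]          # hypothetical name of the static mesh
mesh_col = bpy.data.collections["Mesh"]  # collection created by the Add-on

for col in obj.users_collection:         # unlink from any current collections
    col.objects.unlink(obj)
mesh_col.objects.link(obj)               # link into the live-workflow Mesh collection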
Once your collections are organized you are ready to export your assets to Unreal. From the main Blender menu click the Pipeline menu and choose Export > Send to Unreal.
If you have followed the previous steps to configure Blender and Unreal, Blender will look for an Unreal Editor process running in the background, that is, among the processes maintained by your operating system. The workflow is the same for both Windows 10 and Ubuntu 20.04, and you do not need to configure the Add-on to find the Unreal Editor executable.
Once exporting has completed from Blender, open the Content Browser in Unreal and navigate to the directory structure you determined in the Add-on's preferences in Blender.
Once you have located your asset, you will then be able to use it like you would any 3D asset in Unreal including turning it into a Blueprint.
If you’ve ever wanted to make your own 3D game but felt overwhelmed by the prospect of having to learn how to code the complexity of an interactive game system, then learning how to use Unreal Engine’s Blueprints system might be the solution you’re looking for.
For many decades the C++ programming language has been a particularly favoured choice in games development. The language is considered a mid-level programming language: developers benefit from very readable syntax, scalable and maintainable paradigms and other high-level programming constructs, while at the same time having direct access to system resources via memory management, something that is typically reserved for low-level programming languages these days.
The aforementioned reasons, coupled with the ability to develop simple to complex infrastructures, make C++ an easy choice when it comes to developing systems that require media management, AI (Artificial Intelligence), physics and rendering engines, to mention a few of the requirements within games development.
However, for some time several visual code-creation and editing systems have appeared that attempt to abstract away the complexity of developing these interactive systems. This is particularly relevant for artists and content creators who are perhaps not as concerned with the kudos acquired from tweaking a function to gain a nanosecond of a performance boost, or for developers who wish to prototype an idea quickly.
That’s not to say that these are the specific use cases for learning Unreal Engine’s Blueprints, in fact as we will learn Blueprints have wide-ranging capabilities, that make the system a comprehensive choice in developing many different types of games as well as providing the ability to extend a game system with C++, when necessary.
In a previous post, we had a look at developing an asset in Blender then importing it into UE4 as a Blueprint. Although we covered the basics of creating a Blueprint, we did not dive into attaching any custom interactivity to the asset. In this post, we’re going to pick up from where we left off and dive a little deeper into what the Blueprints visual scripting system is all about.
In order to add interactivity to your game, some form of coding is required. Whether you create that code by hand or use a system that does it for you, code is necessary to drive the interactions between game elements which result in meaningful outcomes. Bearing this in mind, regardless of whether you are an artist developing assets and using a node-based editor to generate your scripts or a developer looking to achieve results quickly, having a top-level overview of some basic coding concepts and how they apply to Blueprints will certainly go a long way towards a greater understanding of what makes your game work. Ultimately, this can also go a long way towards fixing problems within your games when they don’t work as you were expecting.
When starting a new project within the UE4 editor you have the option of choosing a Blueprints or C++ based project. In fact, Blueprints are a visual representation of the underlying C++ code that drives the logic of your UE4 game. The reality is that you do not need to settle on one, at the expense of not being able to use the other. Many games use both C++ and Blueprints quite effectively, together.
So when would you use Blueprints and when would you use C++? You can develop an entire game using only Blueprints, as it is a very robust and extensive platform for visual scripting; you can, therefore, expect that some level of commitment is required to utilize it efficiently. C++ is often used to create game elements that act as building blocks for your game, and as you might gather from the term "building block", these elements provide core functionality or statefulness that is referenced with some degree of regularity. It is also not uncommon for aspects of these C++ game elements to be exposed within the UE4 Editor through a Blueprint interface, thereby providing content creators or non-programmers access to core game elements by means of a visual scripting language.
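To make that exposure concrete, here is a minimal, hypothetical sketch of how a C++ building block is surfaced to Blueprints through Unreal's reflection macros; the class and member names are our own, not from any particular template:

```cpp
// StarPickUpBase.h -- hypothetical C++ "building block" exposed to Blueprints.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "StarPickUpBase.generated.h"

UCLASS(Blueprintable) // Blueprint classes may derive from this C++ Actor.
class AStarPickUpBase : public AActor
{
    GENERATED_BODY()

public:
    // Appears as an editable property in the Blueprint editor and on instances.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "PickUp")
    int32 ScoreValue = 25;

    // Appears as a callable node in any Blueprint graph.
    UFUNCTION(BlueprintCallable, Category = "PickUp")
    void Collect() { Destroy(); }
};
```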
Game Outcomes with Blueprints
Our objectives for this phase of developing our game are quite simple, as the main outcome remains focused on a basic introduction to the Blueprints visual scripting system in order to add interactivity to elements within our game. We will continue from our previous post on developing the StarPickUp asset. The outcome, from the perspective of the player, is that when the character passes through the StarPickUp it should disappear. At that point in time, a relevant value is added to the player's score and this is reflected on the screen during gameplay. At present, when the player attempts to pass through the Star pickup they are prevented from doing so. This is the default behaviour of UE4: the StarPickUp Actor has an invisible collision box surrounding it which prevents the player (Pawn) from passing through it.
Our process for adding the required interactivity follows,
Detect if the Pawn is colliding with the Star Pickup Actor
If so, add a corresponding value to the Pawn’s Score
Destroy the Star Pickup Actor
Update and display the Player’s score
Transferrable Knowledge in Working with Blueprints
In order to add the necessary functionality to our Blueprint, open the StarPickUp's Blueprint editor. We are going to start by creating some very basic behaviour that will allow the player to pass through the Pickup. At that point, the Pickup will be destroyed (removed from the game).
Open the Blueprints folder in the Content Browser and double-click the StarPickUp asset to open its Blueprint editor. Bear in mind we are not editing the Static Mesh asset directly, which would typically be located in the Meshes directory within the Content Browser. The Static Mesh actually forms part of the StarPickUp Blueprint Class.
Once inside the Blueprint Editor, select the Static Mesh (StarMesh) in the Components panel (on the left-hand side of the Blueprint Editor Interface).
There are various types of Blueprints that we can create but a Blueprint Class (also simply referred to as a Blueprint) will be the most common we will use throughout this series and for many other game projects. If you are familiar with the Object-Oriented programming paradigm the term class is used very much in the same context.
You can think of the class that we are currently working on as something that retains all that is necessary to describe how the pickup will eventually work in the game world. Once we have completed work on our class and we are ready to use it in our game, we will drag copies of it into the 3D viewport from the Content Browser. In fact, the objects (or Actors) that we place in the 3D viewport can also be referred to as instances rather than copies. They are instances because they inherit everything required to make them work as expected from the class that we designed, and any changes we make to the class will be reflected in all of its instances.
Depending on how you design your class, the instances of the class can have various different properties; for example, each StarPickUp that is instantiated from the class will have a different position. You could even design them to have different colors, or different values equating to higher or lower scores when the Player passes through them. So although they all come from the same class, their purpose within the game world might differ.
As you can imagine, bearing this in mind, the visual scripting language’s namesake, Blueprint, is no coincidence. When we create classes we are effectively creating blueprints that describe how the objects that we use within the game world will work and interact with other game elements.
If this concept is somewhat difficult to grasp you could think about it in the context of a blueprint for a building. The blueprint contains all of the necessary information for creating the building; however, the blueprint itself is not something you could live in. It's simply there to describe the possible outcomes. When you create a building from the blueprint, that becomes the useful object, in the same way that we instantiate Actors from the blueprint class in UE4 and place them in the game world. However, it's also worth remembering that not all objects are necessarily equal. For example, using the same blueprint one building could be used as a home while another could be used as an office.
Although C++ was not the first language to implement object-oriented programming, it has certainly done a lot to popularize the paradigm, as we see many different high-level languages supporting it today. You certainly don't need to understand Object-Oriented Programming (OOP) to work with Blueprints, but if you ever wish to take things a little further by integrating your blueprints with custom C++, a basic understanding of OOP can certainly go a long way.
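If the class-versus-instance idea is new to you, the following few lines of plain C++ (with purely illustrative names) show the same relationship outside of any engine:

```cpp
#include <iostream>

// The class is the "blueprint": it describes what every pickup has,
// but it is not itself an object in the game world.
struct StarPickUp {
    float X = 0.f, Y = 0.f; // position
    int   Score = 25;       // value awarded when collected
};

int main() {
    // Instances: each inherits its structure from the class,
    // but can carry different property values.
    StarPickUp a;     // default pickup, score of 25
    StarPickUp b;
    b.X = 100.f;
    b.Score = 50;     // a rarer star worth more

    std::cout << a.Score << ' ' << b.Score << '\n'; // prints: 25 50
}
```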
Collision Detection and Reaction Setup
In the Details Panel (on the right-hand side of the Blueprint Editor) look for a section called Collision. You will find a drop-down list for Collision Presets. These presets set up quick configurations for when a collision occurs with the selected component.
Select OverlapOnlyPawn from the options. This setting will allow for actions to be triggered when the Pawn collides with the StarPickUp Actor.
Below the drop-down list you can see a set of options that are affected by the choice you have made. You can also choose Custom to determine your own configuration.
As no physics are required to trigger the action that will destroy the StarPickUp the Pawn is colliding with, you will notice that collision is enabled with a Query only. This can save some valuable computational resources.
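For comparison, the same preset can be applied in C++ on the mesh's primitive component. A minimal sketch, assuming a UStaticMeshComponent member we have named StarMesh:

```cpp
// In the actor's constructor, after creating the StarMesh component.
// Equivalent to picking the OverlapOnlyPawn preset in the Details panel:
StarMesh->SetCollisionProfileName(TEXT("OverlapOnlyPawn"));

// Equivalent to the Query Only setting: overlaps are detected,
// but no physics simulation or blocking response is computed.
StarMesh->SetCollisionEnabled(ECollisionEnabled::QueryOnly);
```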
In the Details Panel, scroll down to the section called Events and click on the + (plus button) to modify the Event Graph for the On Component Begin Overlap event.
You will then be taken to the Event Graph (in the middle of the Blueprint Editor interface). The Event Graph is where the concept of the visual scripting interface really comes to life. The Event Graph represents various events that are triggered during gameplay and therefore provides a visualization of much of a game’s interactivity.
When the Event Graph is loaded the On Component Begin Overlap node will automatically be added. This is a result of entering this interface through the Events section of the Details Panel (as previously noted).
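Behind the node sits a dynamic delegate on the component. Here is a hedged C++ sketch of the same event subscription; the class and handler names are ours, and module export macros are omitted for brevity:

```cpp
// StarPickUp.h -- sketch only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "StarPickUp.generated.h"

UCLASS()
class AStarPickUp : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* StarMesh;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // The C++ counterpart of the On Component Begin Overlap event node:
        StarMesh->OnComponentBeginOverlap.AddDynamic(this, &AStarPickUp::OnBeginOverlap);
    }

    // The exact signature the overlap delegate expects.
    UFUNCTION()
    void OnBeginOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                        UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                        bool bFromSweep, const FHitResult& SweepResult);
};
```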
Understanding Nodes
Nodes provide the core visualization of data that forms the scripted element of a game. They can be made up of various types of data, perform various functions and can be used to construct countless programmatic statements. In Unreal Editor you would typically access Nodes for creating and modifying Blueprints through the Graph Editor within a tab such as the Event Graph (which we are currently using) or the Construction Script.
The Graph Editor (within the Blueprint Editor) is used to edit the Nodes that form the currently selected element’s Event Graph. The Event Graph is a node-based visualization of the code that will typically be executed for the currently selected element. For example, when gameplay begins the Nodes within the Event Graph will be executed for the StarPickUp Actor, thereby defining that Actor’s interactivity within the game world. How the Nodes are executed, will depend on how you have constructed the graph.
Constructing an asset’s Event Graph can be very straight forward or exceptionally complex however there remains some basic features of the Event Graph that are consistent across simple to complex systems.
The Event Graph for an asset consists of Nodes
Nodes are typically connected to each other
How Nodes are connected to each other will depend on what you are hoping to achieve, but again there are some very basic rules that apply across all systems, which makes it somewhat easier to learn how to create an asset's Event Graph.
Nodes are represented as a block with a heading (the name of the node) and one or many pins. The pins of a Node will be of various colors but there are essentially only two different types of pins.
Executable Pins
Data Pins
Both Executable and Data pins can be either an Input or Output pin. Whether the pin is an input or output does not change the type of pin it is; it simply determines how data enters and exits a node through its available pins. All input pins are aligned to the left side of the node and all output pins are aligned to the right of the node. You might have noticed that the On Component Begin Overlap node only has output pins; nodes can consist of either or both (depending on the node in question).
Executable Pins
Bearing this in mind, a Node’s Executable Pins represent this concept implicitly. Executable pins appear on a node as somewhat arrow shaped and it is this arrow that points towards the order of execution.
You can connect node’s Executable pins and thereby determine the order of a scripts execution, by clicking and dragging the output executable pin of one node and dropping it onto the input executable pin of another node.
Data Pins
However, what about when you want to pass data resulting from one node's execution to another node? That is when you would need to use a Data pin in conjunction with an Executable pin. Passing data from one node to another follows the same sequence as connecting executable pins, that is, you connect the output of one data pin to the input of a data pin on another node. However, when making the connection between data pins there are several other factors to take into consideration.
Not all data pins are compatible with each other. Data pins are all of a particular type, and it may not be possible to convert data of one type into another. A pin’s different data types are all color coded and it is safe to assume that in most instances connecting an output pin to an input pin of the same color type will result in a valid connection. Some of the data types we will use most often follow,
Red Pins – represent boolean data, which will typically have a value of true or false
Light Blue Pins – are of type integer, otherwise known as a whole number (0, 1, 2, 3, etc.)
Green Pins – handle Float values, or numbers with a decimal point (3.174, etc.)
Pink Pins – are used for Strings, in other words, literal characters that don't need to be evaluated within mathematical expressions in order to return a value. Often strings will contain letters of the alphabet, for example "This is a string value", and so is this: "1234abcd".
Of course, there are other data types, but for the purposes of what we are setting out to accomplish an understanding of these four will be more than sufficient for now.
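Incidentally, the four pin colours map directly onto familiar C++ types as Unreal defines them:

```cpp
bool    bIsCollected = false;             // red pin        -- boolean
int32   PickUpScore  = 0;                 // light blue pin -- integer (whole numbers)
float   GlowStrength = 50.0f;             // green pin      -- float (decimal numbers)
FString DebugLabel   = TEXT("a string");  // pink pin       -- string (literal characters)
```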
Destroying the StarPickUp
There is more than one way of creating and connecting nodes within the Graph Editor. By clicking and dragging a node's output executable pin a wire will appear. These wires represent the links between data and executable pins. When two nodes are connected they are said to be "wired".
When making a connection in this way the node you wish to connect to does not need to exist prior to dragging the output pin; simply drop the pin on an empty area of the Graph Editor and a context sensitive menu will appear. From this menu you can search for the node you wish to create a connection with, by its name. In our case the node we would like the executable pin to connect to is the DestroyActor node. Type in the name of the node and select it from the list of options.
The Editor is smart enough to know that you intended to create a connection between the output executable pin of the On Component Begin Overlap node and the input executable pin of the DestroyActor node. It will subsequently wire the nodes correctly for you.
Once you have the Event Graph for the StarPickUp created, Compile and Save the changes. Then close the Blueprint Editor and test your game. Now when your Pawn collides with the StarPickUp, it disappears and the Pawn is free to continue running.
A Network of Nodes
Although we are moving in the right direction, we are not quite there yet as we currently have no score system that is able to track how many pickups our player has collided with and, as a result, what the player’s score is.
In order to accomplish this we will need a slightly more complex network of Nodes to replace the Event Graph we currently have associated with our StarPickUp. Don't worry though: although the setup we will be replacing our existing Event Graph with is more complex, the logic behind it is fundamental to developing many different types of interactions for games. If you can understand the logic, you only need to learn it once, then adapt the approach to a variety of different interactions within your games.
There are a couple of tasks we will need to perform when creating our new node network,
Add a variable to our pawn
Access/Get this variable from the StarPickUp actor
Manipulate the value
Then return/Set the value to replace the old value
Display the new value during game play
In other words, we are going to get and set a variable; this is one of the most fundamental concepts of programming.
Working with Variables
Once you are satisfied with your results, end the game and go back to the Content Browser. We will first need to add a variable to keep track of the player's score; we will add this to the Pawn. Double-click the SideScrollerCharacter to enter its Blueprint editor.
Under the Components section, you will find a section called Variables. Variables are used to store information. This information can be a number, a string of text, a group of different values or various other types of data. As we would like our variable to keep track of the Player's score, our variable will be a number; more specifically, it will be an integer data type. Integers are whole numbers like 0, 1, 300, -2, etc., as opposed to floating point numbers, which are numbers with a decimal place, e.g. 2.12, 45.1, 78.0982. As previously mentioned, it's considered best practice not to mix different types of data. However, as we will see a bit later, this can be possible through the process of casting. Bear in mind, though, casting can come at the expense of some additional computational resources (and perhaps bad practices too).
Click on the +Variable button to create a new variable attached to the currently selected character. Name the variable PickUpScore.
In the Details panel a section for editing the variable will now be available. Your new variable's name will be displayed as well as its data type. As previously mentioned, we will use the variable to keep track of our player's score, and as a result we will need this variable's data type to be a number. As we will not be concerned with decimal numbers, our variable's data type will be integer. Click on the drop-down list and change it from the default value of boolean to integer. It's worth noting that using integers where possible, over floating point numbers, can contribute to better managing a system's resources: with a float (as they are sometimes called) you are concerned with precise values, so more digits are required for this accuracy. This can result in more memory usage, unnecessarily.
Your new variable’s description should now currently look like something in the image.
We are going to initialize our new variable with a default value of zero. This makes sense as you would like your player to start with a value of 0 when the game starts and hopefully increase their score as they progress through the level.
In the Details panel you will find a section called Default Value, where you can specify the starting value of the variable. If you are unable to add the default value, then you will first need to Compile and Save your Blueprint before proceeding. Once the blueprint has been compiled, the variable will be accessible not only from the current blueprint editor but from other game elements too. This is significant, as we will need to access this variable from the StarPickUp when a collision is detected between the pickup and the pawn. We can then modify the variable via the StarPickUp blueprint.
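As a point of reference, declaring the same variable on a C++ character class would look roughly like this sketch (the property name matches the one we just created; the macro flags shown are typical choices, not the only ones):

```cpp
// In the character class body. EditAnywhere exposes the default value in the
// editor; BlueprintReadWrite lets Blueprint graphs both get and set it.
UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Score")
int32 PickUpScore = 0; // default value of zero, as configured above
```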
Creating the Scoring System
We will be adding the functionality that adjusts the player's score to the StarPickUp. This makes sense, as we will use the collision detection that we previously set up between the pawn and the pickup (that made the pickup disappear) to access the pawn's variables, set a new value and replace the old value with the new value on the pawn.
If you are finding it difficult to visualize what we are doing, you can think of it in these terms: the player holds the score, and the pickup determines what to do with the score.
Double-click the StarPickUp to enter its blueprint editor. You will see that the previous Event Graph we set up to destroy the pickup is still visible. We are going to modify this graph to include the functionality that updates the score before destroying the pickup.
We are going to start by getting access to the variables associated with the SideScrollerCharacter.
Click and drag the Other Actor pin to an empty space in the graph editor and drop the pin. Within the context sensitive menu start typing “sidescrollercharacter” (spaces and casing are not particularly important). Select Cast To SideScrollerCharacter from the menu.
Getting
A new Node appears with the relevant connections already wired. This node allows you to access other actors from the currently active selection. In our case the active selection would be the StarPickUp and the actor we are trying to access is the SideScrollerCharacter. By casting from our current blueprint class to another actor we can do things such as access data and functionality of the other actor, then use the results in our currently selected blueprint. This is precisely what we will be doing.
Click and drag a pin from As Side Scroller Character and start typing the name of the variable ("pickupscore") we created in this actor (our pawn). Options to Get and Set the variable will appear. We are, of course, first concerned with getting the variable, and only later setting it once we have updated its value.
Select Get Pick Up Score from the menu options.
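In C++ terms, the Cast To node is Unreal's Cast<T>() template. A minimal sketch, assuming the template's side-scroller character class and the PickUpScore variable declared in C++ as shown earlier:

```cpp
// OtherActor corresponds to the node's Other Actor pin.
// Cast<>() returns nullptr if OtherActor is not a SideScrollerCharacter,
// which is what the node's failed-cast path represents.
if (ASideScrollerCharacter* Character = Cast<ASideScrollerCharacter>(OtherActor))
{
    const int32 CurrentScore = Character->PickUpScore; // the Get step
}
```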
Once you have the new node wired, your Event Graph should look like the image.
We are now going to perform some simple addition on the value that we just retrieved. Before continuing, it might be a good time to reiterate what we are doing. Now that we have access to the PickUpScore variable, we are going to add a number to it; this number represents the value that the player's score is incremented by each time they pass through a pickup. By getting the value that is set on the player, we can augment that value with another integer. Therein resides the reason why the score must be a variable, while the integer being added to it does not need to be one.
In order to add a number to the PickUpScore variable, click and drag the Pick Up Score pin from the Get node and type “int + int” in the context sensitive menu. Select the integer + integer node.
You should now have an Event graph that resembles the image. The integer + integer Node has two input pins the first should have automatically been wired with the output from Pick Up Score, the second input is the value (number) that will be added to the first. You could specify another node as the input value or simply type an integer into the input field. As the value we are incrementing our score with does not change, we will simply input a number into the input field.
We now have a new number: the value that increments the player's score plus the previous value of the player's score. In other words, if our player's score was 0 and we incremented it by 25 with the integer + integer node, our expression would look something like 0 + 25, and our new value would be 25. The next time the player passes through a pickup the expression would look different, for example 25 + 25, so the value would be 50, and so on.
At this point in time, the expression is being calculated but the result is not being stored for further use. As you can imagine, the result should be reflected by the original variable, PickUpScore, as we are ultimately performing this calculation to determine the player's score. Bearing this in mind, we are now going to assign the new value back to the original variable after the calculation has been executed. Just as we obtained the variable in order to get its value, we will follow a similar procedure in order to set its value.
Go back to the Cast To SideScrollerCharacter node and, from the same pin that you used to obtain the PickUpScore variable (As Side Scroller Character), drag another pin and again start typing "pickupscore". However, this time choose Set Pick Up Score from the menu.
Setting
You will notice that the Set Pick Up Score node has three input pins, one executable pin and two data pins. Just as with the Get node it has a Target pin in order to retrieve the necessary information from the actor. A Pick Up Score input pin also exists that allows you to type in a specific value to set the variable within the input field or you could use another node to set the value dynamically. Of course, the latter is going to be what interests us. To reiterate we are setting the value of the variable dynamically from the integer + integer node.
In this example the second value of the integer + integer node has been set to 25.
Drag the output pin from integer + integer to the input pin of the Set node's Pick Up Score value. We have now updated the PickUpScore variable on the SideScrollerCharacter with the value that has had the expression applied to it. To recap: first we got the value from the actor, then we manipulated the value, and finally we returned the result back to the original variable, thereby setting the variable.
We have now completed the necessary mathematical operations to get our scoring system started. However, if you were to test the game at this stage there would be no visual indication of all the hard work we just performed, for two primary reasons. Firstly, and most importantly, although the code works (and you can verify this by clicking the Compile and Save buttons), it is never actually executed.
If you were to follow the flow of the script’s execution from the point at which the collision between the pawn and the actor (i.e. the SideScrollerCharacter and the StarPickUp) occurs, you will notice that it goes straight to the DestroyActor Node, thereby bypassing all of the new modifications we made to the graph.
In order to fix this we will need to disconnect the wire to DestroyActor and place that node at the end of the chain, after Get and Set have been executed.
Alt-click the execution wire going into DestroyActor’s input executable pin. This effectively deletes the wire but leaves the node intact.
Now that the executable output pin from the On Component Begin Overlap node is available again, click and drag the pin to the input executable pin of the Cast To SideScrollerCharacter node.
Following a logical flow of execution, click and drag the output executable pin from the Cast To SideScrollerCharacter node to the input executable pin of the Set node. As you will notice, in order for the Set node to complete execution, the Get and integer + integer nodes are executed first. Consequently, there is no need for these nodes to have executable pins of their own.
Finally, reconnect the DestroyActor node to the end of the event graph by connecting its input executable pin to the output executable pin of the Set node.
Your new event graph is now completed and it should look something like the image. Click Compile and Save before exiting the Blueprint Editor.
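For readers curious about the textual equivalent, the finished graph corresponds roughly to the following C++ overlap handler. This is a sketch under the same assumptions as before: PickUpScore declared on a C++ ASideScrollerCharacter (in the actual template it is a Blueprint variable) and an increment of 25:

```cpp
void AStarPickUp::OnBeginOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                                 UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                                 bool bFromSweep, const FHitResult& SweepResult)
{
    // Cast To SideScrollerCharacter
    if (ASideScrollerCharacter* Character = Cast<ASideScrollerCharacter>(OtherActor))
    {
        // Get, integer + integer, Set -- all in one expression.
        Character->PickUpScore = Character->PickUpScore + 25;

        // DestroyActor, wired last so it only runs after the score is updated.
        Destroy();
    }
}
```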
Debugging Tools
If at this point you were to test your game you would still not see any visual representation of the player's score. However, this time around, although you cannot see a difference, there is in fact a fundamental difference to the game's scoring system in that the code is now successfully executed.
Debugging, in terms of software development, relates to the process of removing errors and faults in a codebase. However, some errors might not be clearly visible when running your codebase, and as a result many development toolkits (Unreal Editor included) provide tools to assist with this process by exposing parts of how your application is running and whether its performance matches your expectations.
One of the debugging tools available to us in Unreal Editor is the Print String node. This node has the ability to display text while the game is running. As this is a debugging node, the text will only be visible while the game is run through the Editor; in other words, debugging information is not (and should not be) retained when the game is built for a specific platform.
Let's go ahead and again delete the execution wire connecting the Set and DestroyActor nodes. Although the process we have taken of connecting and disconnecting nodes may seem unnecessary, it is in fact very common to work like this within the Event Graph as you experiment with new functionality and change existing graphs. Drag an executable output pin from the Set node and start typing "printstring". Select the Print String option and connect its output executable pin to the input executable pin of the DestroyActor node.
The Print String node has a data input pin called In String. By default this is simply set to a text value of "Hello". This is what will be displayed during gameplay for the purposes of debugging, and as you can imagine it is not presently particularly helpful for us. We would ultimately like this node to serve the purpose of printing the player's score during gameplay while in debugging mode.
We will need this value to be updated dynamically and as a result, we will use the Set node to update the value. Click and drag the output pin of the Set node’s Pick Up Score property. Drop the pin on the input In String pin of the Print String node.
When hovering this pin over the In String pin you will notice a message appear saying Convert Integer to String. As previously noted, data pins are color coded, and you might have noticed we are mixing colors here, as we are attempting to connect an integer data type with a string data type. Fortunately, in this case it is possible: when you drop the pin, the Editor creates an auto-casting node. This node performs data type casting (conversion) from an integer to a string for us.
Compile and Save the blueprint before exiting the editor.
The result is that now, when you play the game, your character can pass through a pickup, the pickup is destroyed, and just before this happens the player's score is updated and displayed (for debugging purposes) in the viewport.
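For completeness, a common C++ counterpart to the Print String node is the engine's on-screen debug message facility, sketched below; the integer-to-string conversion mirrors the auto-casting node the Editor created for us:

```cpp
// Requires #include "Engine/Engine.h" for the global GEngine pointer.
#if !UE_BUILD_SHIPPING // debug output is stripped from shipping builds
    // Key -1 always adds a new message; shown for 2 seconds in blue.
    GEngine->AddOnScreenDebugMessage(-1, 2.0f, FColor::Blue,
        FString::FromInt(Character->PickUpScore));
#endif
```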
Coming up
If you have worked your way through the entire series then you have accomplished a great deal in terms of getting to grips with a fundamental understanding of working with Assets and the Blueprint Editor to create interactivity within your games.
In the next post we will be looking at developing a health system for the game, as well as adding assets with animation within our game and converting the debugging information to something more useful that our players can see.
Welcome to the first in a series of posts on developing a 3D Sidescroller Racing Game. By following along with this series you will gain an understanding of what is required when working with Blender, Unreal Engine 4 as well as several other Open Source applications in developing a functional 3D game, that utilizes game mechanics similar to that found in Sonic the Hedgehog and gameplay inspired by Wipeout 2097.
In this post we will start by developing a 3D asset in Blender, that we will then import into Unreal Engine 4 (UE4). We will then use this asset within the Unreal Editor as an item within a side-scrolling game that acts as a pick-up. A pick-up is typically any item within a game world that is collectable by a game character. In terms of Unreal Engine-speak, we would refer to our game character as a Pawn and the pickup as an Actor. Simply put, our Pawn is the character we control within the game and it will interact with our Actor, the pickup object.
The focus of this post is not on the interaction between the Pawn and the Actor, but rather on the core concepts of developing an actor within UE4. We will develop the pickup 3D model in Blender with special consideration regarding the creation of assets for realtime applications. We will then create a Blueprint Actor with the 3D model in UE4 that has a glowing material and rotates in-game, like many a pickup of similar type from other games you may have played.
Prerequisites
No prior programming or game development knowledge is required, as we will unpack both the rationale and implementation behind the fundamentals of 3D games dev. We will also explore deployment to both mobile and desktop platforms in later posts.
You will, however, need a computer capable of running high-end 3D applications, as well as a decent internet connection and a commitment to some level of self-regulated learning.
In order to complete this tutorial, you will need both Blender and Unreal Editor 4 installed. If you would like to find out more about installing Unreal Editor, please read through this post.
You will not require anything more than a basic understanding of Blender’s modelling tools as well as some general background knowledge of 3D. If you are just starting from scratch then you should consider taking this free 3D course before continuing.
Building the Asset
When building game assets there are various factors to take into consideration. It's always worthwhile keeping an eye on the polycount of your assets, making sure that you are not creating non-manifold geometry, as well as creating edge loops that subdivide as evenly as possible, as this will assist with creating different Level Of Detail (LOD) objects.
As a pickup is an asset that will be used many times throughout a game level, keeping its polycount low becomes exponentially significant. A low polycount places less strain on the game's rendering engine, which ultimately contributes to making your game play smoother and prevents frame-skipping. Also bear in mind that it is a misconception to think that keeping a low polycount is only relevant if you are producing games for mobile devices. Although a low polycount is certainly recommended for mobile games, it is also a topic of concern for PC gaming, and console games can benefit from the performance increase as well. When considering the distribution of a system's resources for the purposes of rendering and making a game playable, remember that resources consumed transforming and rendering geometry could potentially have been used elsewhere, to improve the quality of lighting in a scene or to provide smoother physics simulations. In other words, thinking about balancing the load of the rendering engine is not a consideration exclusively for software engineers, but also something that artists and other technicians should always be aware of.
As performance is such an important topic in games development, we will be addressing it wherever possible.
Modelling and Asset Preparation
Our pickup is going to be in the shape of a star, and although there are many ways in which you could model this in Blender, one of the simplest approaches would be to use an application such as Inkscape.
If you use a primarily open-source production pipeline, then Inkscape is certainly a tool you might want to consider adding to your workflow. It is a cross-platform vector drawing application and can be used to generate splines that import seamlessly into Blender.
Inkscape has a set of premade shapes that are highly customisable, including a star. Once the star was quickly created, it was saved as a vector in SVG (Scalable Vector Graphic) format, Inkscape's native file format. From here the SVG is easily imported into Blender and works as any spline typically would. This is also very useful if you ever need to tweak the vector graphic as Blender splines, particularly as all the bezier handles generated in Inkscape are retained in Blender.
From Blender simply import the SVG, using Blender's SVG import plugin.
Once the SVG has been imported you can tweak it from here, using Blender’s spline modelling tools. When you are satisfied with the results select the curve and convert it to a mesh.
In order to retain as much control as possible of your 3D assets for your games, converting them to polygons within Blender (and before exporting them) will help you to visualize what the final results will look like. This will also afford you the benefit of being able to address any concerns with the geometry before importing them into UE4.
As a result, it’s recommended that if you have an exporter that does polygon conversion for you there are times when you might want to skip that option in order to retain as much control of the creation for such a critical asset as a pickup that will appear abundantly in some game levels.
Sometimes in Blender, when converting curves to a polygon object, the results might not be what you expect. In our case the conversion process resulted in malformed geometry with polygons of inconsistent dimensions; however, as we are dealing with a simple primitive shape, fixing this will be quick and easy.
In this case the internal edges of the star were selected then deleted, resulting in an outline of the star.
The edges making up the outline were then selected and extruded. This can be achieved in Blender's Edit Mode by hitting the e key and moving the mouse/stylus in the viewport. To constrain the extrude to a particular axis, hit the key corresponding to that axis. In this case, the key combination for extruding the outline of the star was e then z.
Before exporting assets such as props to UE4 you should always ensure that the object’s Center Of Mass (COM) is in the middle of the object’s volumetric boundaries. The only time when this might not be applicable is when working with characters. In this case, typically your object’s COM could be on the ground plane, in between your character’s feet or the middle of your character’s hip area.
Modelling Considerations for Realtime Applications
The location of the COM is significant with regards to a game engine, as animations applied within the engine will take the COM into consideration. In our case, we will be making the star rotate around an axis within the game level; this axis is defined by the center of the object's volume within the game. As such, ensuring that the object's COM within Blender is also centralized to the object's volume will result in the axis used for rotating the object within the game aligning with the object's COM as you have defined it through the modelling process.
Using Blender’s modelling tools create your pickup asset as you see fit. Some matters to take into consideration when creating your assets include
Try to keep your geometry minimalistic
Use triangles and quadrangles but avoid ngons
Recalculate your geometry’s Normals so that they always point away from the mesh’s interior. This can be achieved in Blender by selecting all vertices/faces and hitting shift-n.
At this stage, you might consider laying out your model's UV's. Although it is not always necessary, particularly when dealing with a 3D model that acts as a light source, if you wish to add any textures to your model within the game level then laying out UV's will be required.
When laying out UV’s try to ensure that they take up as much of the object’s texture space as possible (sometimes referred to as 0 to 1 texture space). This ensures that you have the greatest surface area in which to make the texture’s details visible and clear.
UV’s should be non-overlapping, for the sake of simplicity. Although UE4 does support multiple UV sets/channels which will allow a vertex in one location to inhabit more than one set of coordinates within multiple UV sets. This can simply result in additional overheads with regrads to system resources that can be utilized elsewhere.
UV islands should match your object's shape as closely as possible; for example, the UV map of the star above looks like that of a star.
Once you are satisfied with your model it’s then time to export it to FBX.
FBX is a file interchange format that retains many useful attributes of your 3D assets such as some materials, animation, UV’s and many other properties.
It is a format that is supported by both Unreal Engine and Blender, however, do take into consideration that when you export your 3D assets from Blender to FBX you will be losing some editable qualities. As a result, it is always recommended that you keep your .blend files even after exporting your assets to FBX, in the event that any changes need to be applied to the original model.
Select the object you want to export, then from Blender's File menu choose Export FBX. Make sure you have Selected Objects checked in the FBX export options; the other default settings will suffice in most cases.
Set Up Your Game Project in the Unreal Editor
We are going to be creating a 3D side scroller game, in which the character can race through levels, collect pickups and try to avoid obstacles, very similar to the mechanics of Sonic the Hedgehog.
We will start by using Unreal’s Side Scroller Template. These templates are a great starting point for many different game types including First Person Perspective, 3rd Person Perspective and many others. A lot of the basics have already been built into the templates so starting with a template can ultimately save you a lot of time.
We will primarily make use of Unreal Editor's Blueprint visual scripting interface, as opposed to driving the game's interactivity primarily with our own C++ scripts. Even if you are very familiar with the C++ programming language, there are still many benefits to working with Blueprints: it remains an extremely versatile choice, capable of achieving results quickly, and in some cases the performance increase you would gain from working exclusively with C++ might not even be noticeable.
Once you have your new project created you can test the default gameplay, simply by clicking the Play button in the main interface. You can then use your keyboard to control the character in the game.
Once you are done press ESC to return to the Editor.
You might notice that our game is missing some crucial components. We will start by adding the pickup we previously created in Blender to our game.
Import A 3D Mesh Asset
Below the 3D Viewport, you will find the Content Browser. This panel provides a visual representation of the components that make up the game. Assets not only include 3D meshes but also Blueprints, sounds, materials, animations and many other types.
As your projects grow in scale, the need to keep your assets logically ordered and therefore easier to find will become more relevant. As a result, we are going to keep things organized by importing the Star Pickup 3D Asset under the Geometry folder within the Content Browser. Open the Geometry folder and navigate to Meshes.
Click the Import button to bring up your system’s file chooser dialogue box. Then navigate to the directory where you saved the Star Pickup FBX file and import it into the scene.
Unreal will recognise the file type you are trying to import (it supports many different file types for importing). It will subsequently load the FBX Import Options dialogue box. Typically the default options will be suitable. Unreal is also generally capable of detecting if the FBX has animation or not and will subsequently check or uncheck the Skeletal Mesh option. As our asset does not have animation there is no need to have this option checked. You can find out more about importing meshes with skeletal animation here.
It’s worth noting that the mesh has not been imported into the game world. It is simply stored within the Editor for eventual use within the game. In order for us to use the asset in the game, we could drag the asset into the 3D viewport from the main editor interface. The result of this action would be simply placing a mesh in the game, it would have no exclusive, interactive properties. However, because we want our 3D model to have interactivity that we define, we will attach our asset to a Blueprint, then drag the Blueprint into the game world.
The benefit of the latter method is that every time we drag the Blueprint into the game world all of the functionality associated with the asset comes with it.
Create Your First Blueprint
As mentioned there are many benefits to working with Blueprints. We will make our first Blueprint which will essentially serve the purpose of being a pickup in our game.
Although it is not necessary to have any prior programming knowledge in order to use Blueprints, understanding some basic concepts will certainly help when it comes to creating your own Blueprint objects.
Blueprints are essentially a collection of properties and various functionalities. You can define a Blueprint's functionality as well as its properties, and also access preexisting ones too. If you have prior programming knowledge, you could think of a Blueprint as a visual representation of a class within the context of the Object-oriented programming paradigm.
Within the Content Browser, open the SideScrollerBP folder.
Right-click on an empty area within the Content Browser panel and select Blueprint Class from the Create Basic Asset menu.
From the resulting Pick Parent Class dialogue box select Actor. In Unreal, an Actor is any object that can be transformed within the game world. This, of course, covers a very broad spectrum of objects, including creatures, heroes, lights, collision objects, the sky, and the list goes on.
An Actor can be thought of as the most generic class of objects that are placeable within the game world.
A Pawn is a type of Actor that is controllable by the player. In other words, a Pawn could be a vehicle, a robot, a person etc. It is the manifestation by which the player interacts with the game world.
A Character is a type of Pawn with more specialized functionality that enables it to walk, run, jump, swim and perform many other interactions with the game world that you could expect of a human-like or personified character.
You will now have a new Blueprint in your Content Browser. Name this Blueprint StarPickUp then double-click it to open the Blueprint editor with the StarPickUp loaded.
Working with Blueprint Components
There are three main sections within the Blueprint Editor that we are concerned with.
1. The panel on the left-hand side of the Editor is used to add components to the Blueprint. These components can be various types of assets that you have imported into the Editor (such as the Star Pickup mesh) or assets that are already a part of UE4. You can also add unique properties to Blueprints here.
2. The middle section of the Blueprint Editor is primarily for the purposes of visualizing the Blueprint, how all the components relate to each other as well as providing the ability to modify these connections through a Node-based interface.
3. The section on the right consists of the main panel for editing the details of a component or other selection.
Click on the Add Component drop-down and search for Static Mesh.
Add the Static Mesh component to your Blueprint. We will now be able to edit the Static Mesh component by including the Star Pickup mesh to our Blueprint.
If at this point you are struggling to visualize what we are attempting to achieve you can think about our project in terms of a hierarchy.
At the top of the hierarchy is the game level we are currently working on. This game level consists of all our Blueprints, Characters, mesh objects, sounds, etc.
What we are currently doing is adding a Blueprint to this level. This Blueprint is made up of a Static Mesh. The Static Mesh is made up of geometry (polygons) and Materials. The Blueprint itself is also going to include some functionality and properties that will make it possible for our Character, and even other Actors in the game world, to interact with it.
In the Details panel on the right under the section called Static Mesh, you will find a drop-down list. Select the Star mesh that you previously imported and it will now appear in the viewport panel in the middle section of the Blueprint editor.
In the same Details section, you will also find the interface for modifying the Static Mesh's Transforms. It's worth noting that you are not editing the Blueprint's Transforms here. As previously mentioned, the Blueprint type we are working on is an Actor, and Actors are placeable within the game world; therefore this type of Blueprint will also have transforms associated with it. Again it is best to visualize this in terms of a hierarchy: transforms applied to the Blueprint will have an indirect effect on how the Static Mesh is displayed in the game world. The Static Mesh also has its own transforms, which is what we are editing here. These transforms do not affect the Blueprint's transforms, as the Static Mesh can be considered a child of the Blueprint.
Now that we have our Star mesh imported and connected to our Blueprint, we are going to work on making it look somewhat more interesting by adding some simple animation, as well as making it glow. At a later stage, we will add the functionality that turns it into an actual pickup, which augments our character's score during gameplay.
Within the same Blueprint Editor for the StarPickUp, add a new component called Rotating Movement. In order to add components to a Blueprint, follow the same steps we took for adding the Static Mesh. This component will be used to add a simple rotation animation to the pickup; you can specify the axis around which you would like the rotation to occur in the Details panel.
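The C++ equivalent is a couple of lines in the actor's constructor. A sketch, with our own member naming (and requiring #include "GameFramework/RotatingMovementComponent.h"):

```cpp
// In the actor's constructor:
URotatingMovementComponent* Rotator =
    CreateDefaultSubobject<URotatingMovementComponent>(TEXT("RotatingMovement"));

// RotationRate is in degrees per second (Pitch, Yaw, Roll) -- here one full
// turn every two seconds around the vertical (yaw) axis.
Rotator->RotationRate = FRotator(0.0f, 180.0f, 0.0f);
```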
Once you are satisfied with your edits you must compile the results to ensure that there are no errors. As a reminder, Blueprints provide a visual representation of the code that would be required to make your game interactive. Compiling, in terms of programming, is the process of converting one form of a program into another. As such, you can think of compiling Blueprints as converting the work you have done from a format that is human-readable into a format that is machine-readable and can therefore run more effectively on a variety of different systems.
Once you have compiled and saved your Blueprint, close the Blueprint Editor. Back in the main Editor, click and drag the StarPickUp Blueprint you just saved from the Content Browser into the viewport. If you recall, the type of Blueprint we created was an Actor, and as previously mentioned Actors are Blueprints that are placeable within the game world. As a result, when you drag the asset into the viewport you will notice that arrows appear in order to translate (or move) the asset around in 3D space.
Once you have placed your asset into the viewport, you can test your placement and how your character interacts with your asset by clicking the Play button.
Exit Game mode to return to the main editing interface and double click on the StarPickUp Blueprint in the Content Browser.
You might notice that this time around, when the Blueprint Editor opens, instead of seeing the Star Static Mesh you see a grid where the Viewport previously was (in the middle section of the Editor). This is the Event Graph, and its main purpose is to provide a visualization of the properties and functionalities of the Blueprint. As we do not have any yet, the interface might seem pretty empty. At a later stage, we will add functionality to the StarPickUp that will allow us to interact with it during gameplay.
If you want to return to the interface showing the static mesh, click on the Viewport tab. From here you can select the Star Static Mesh. You can also simply select the static mesh from the Content Browser and double-click it for editing.
If you are currently in the Blueprint Editor with the Star Static Mesh selected, in the Details panel on the right section of the Editor you will find a menu for the Mesh’s Materials. You can double-click on the swatch (which should look like the thumbnail of a sphere) and this will open the Material Editor.
Node-based Editing for Materials
Unreal Editor provides an intuitive Node-based editing system that is consistent not only for editing materials but also for designing interactions with Blueprints. If you have previously worked with Blender’s node-based editing system when building materials you will find many similarities in terms of building materials within the Unreal Editor too. Nodes can be added by right-clicking on an empty area within the grid area (middle section) of the Material Editor. You can create connections between Nodes by dragging handles from one Node’s socket to another Node’s socket. When you attempt to make a connection that is invalid the Editor will prompt you with a message in red text.
In order for your mesh to render it will require a material, how that material renders will be determined by the connections we make between the various nodes in the Material Editor.
A default Material is created when you import a Static Mesh that does not already have a predetermined material associated with it.
In our case, we have a material with a Vector Parameter connected to the material's Base Color. You can use this Parameter Node as an easy way to change the main color of your 3D object. This channel is also similarly referred to as the Diffuse Color in other applications.
Simply click on the swatch and a color picker will appear. Change the color, keeping an eye on the Old and New values, then click OK. The Parameter Node is made up of multiple sockets consisting of composite, red, green, blue and alpha channels. Any channel that can accept a number as an input can accept red, green, blue and alpha as an input. Only channels that can accept multiple values will be able to accept the composite channel as an input.
If working with Node’s in this way seems somewhat confusing, not to worry as we will be revisiting the topic throughout the series in various different interfaces. It will take a bit of practice to understand what qualifies as a valid input, but it certainly does not need to be a complex topic for concern at this stage.
We are going to start off by keeping things as simple as possible, as a result, we will use the existing parameter node in a series of connections that contribute towards making our star pickup glow.
You can disconnect two nodes by holding Alt and clicking on the connection you want to delete. To be clear, we don't want to delete the Node; we simply want to delete the connection between the Base Color and Parameter composite channels, therefore just delete the connector.
We are now going to add a Node called a Constant. A constant is a special type of value (or data) that does not change during gameplay. With this in mind we will use our constant to set a value that we use to control how much we want our Star PickUp to glow.
Right-click on an empty area within the Material Editor’s grid viewport to add a new Node. In the search field that appears, type constant and select Constant under the Constants section filtered by your search.
A Constant Node will appear. You can edit its value by selecting the Node and giving it a value in the Details panel. Setting this value to anything above 10 will make the glow on your object more distinguishable. The higher the value, the greater the glow; in some cases a value of 50 might be used to achieve a good balance.
You will notice that your material is still not glowing; this is primarily because we have not yet set up the remaining Nodes, nor made the appropriate connections.
Next, we will add a Multiply Node. Again, right-click on an empty area and search for multiply. The Multiply Node consists of two inputs and one output.
Input channels of Nodes will always be on the left of the Node and output channels on the right-hand side. Logically, when making connections between Nodes, drag an output handle to the appropriate Node’s input channel. You would typically never connect a Node’s output to its own input.
We are now going to finish our material by making the appropriate connections. As noted, the Multiply Node has two input channels; the purpose of the node is to take the input from channel A, multiply it by the input from channel B, then return the result through its output.
In our case, this is significant because all color channels have a minimum and maximum; depending on the software you are using, this could be represented as a range of 0 to 1 or 0 to 255. When color values are pushed beyond their predetermined ranges, the material can start to appear self-illuminated. By feeding these exaggerated values into the Emissive Color channel of a material, a bloom effect is achieved. This bloom effect is what creates the impression of the object glowing.
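If you prefer to script the Editor, the same node graph can be built with Unreal’s Python Editor Script Plugin. The following is a minimal sketch, not the exact graph from this post: the material path /Game/StarPickUp/M_Star, the color and the glow value of 50 are hypothetical placeholders for your own assets and settings.

```python
import unreal

lib = unreal.MaterialEditingLibrary
# Hypothetical asset path; substitute the material on your Star PickUp.
material = unreal.EditorAssetLibrary.load_asset("/Game/StarPickUp/M_Star")

# A Constant node holding the glow strength (values above 10 read clearly).
glow = lib.create_material_expression(
    material, unreal.MaterialExpressionConstant, -600, 0)
glow.set_editor_property("r", 50.0)

# A Vector Parameter providing the base colour of the star (hypothetical value).
color = lib.create_material_expression(
    material, unreal.MaterialExpressionVectorParameter, -600, 200)
color.set_editor_property("default_value", unreal.LinearColor(1.0, 0.8, 0.1, 1.0))

# Multiply the colour by the constant and feed the result to Emissive Color.
multiply = lib.create_material_expression(
    material, unreal.MaterialExpressionMultiply, -300, 100)
lib.connect_material_expressions(color, "", multiply, "A")
lib.connect_material_expressions(glow, "", multiply, "B")
lib.connect_material_property(multiply, "", unreal.MaterialProperty.MP_EMISSIVE_COLOR)

lib.recompile_material(material)
```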
It is, however, worth noting that the object is not actually glowing; if it were, it would also be casting light on the environment in which it is placed. The effect is therefore a simulation that does not match real-world physics unless you are using a more current build of UE4. In builds 4.6 and greater, the emissive material is in fact able to cast light on the environment.
Although it is possible to simulate the effect of the material lighting the environment in versions of Unreal earlier than 4.6, you would have to consider how this will affect performance during gameplay.
As previously mentioned, performance is of great significance when developing games. Take into consideration that this Star PickUp will appear many times within a level; to visualize this more clearly, think about how many times rings appear within a Sonic the Hedgehog level. Wherever it is possible to improve the performance of an asset during gameplay, it is worth weighing the quality of rendering against performance. In some cases, however, you might be able to plan the implementation of an asset without having to make compromises, yet still gain a performance increase.
In our case, this is crucial given how many times the Star PickUp will appear within a single level, as each instance added to the viewport has the potential to negatively impact the game’s performance, in other words, how smoothly the game plays or whether it skips frames on some devices.
Unreal provides various Shading Models when working with materials. These Shading Models have been optimized to reproduce specific qualities of real-world materials such as cloth, hair and many other types. However, one of the Shading Models does not represent a real-world material, as it is intended to ignore the physical properties of light. By setting the Star PickUp’s Shading Model to Unlit, we use only the material’s Emissive channel to control the visible qualities of light related to the object. This can result in a significant performance increase, as physically-based light properties do not need to be calculated for every instance of the Star PickUp.
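If you are following the scripted approach, the Shading Model is exposed as an editor property on the material. Again a sketch, with the same hypothetical asset path as before:

```python
import unreal

# Hypothetical asset path; substitute your own material.
material = unreal.EditorAssetLibrary.load_asset("/Game/StarPickUp/M_Star")

# Switch the material to the Unlit shading model so only the Emissive
# channel contributes to its appearance.
material.set_editor_property("shading_model", unreal.MaterialShadingModel.MSM_UNLIT)
unreal.MaterialEditingLibrary.recompile_material(material)
```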
On the other hand, if the object you wanted to add a glow to also required physically-based lighting properties, for example if your model made use of a Normal map or had specular highlights, then the Unlit Shading Model would not be suitable.
Once you are happy with your pickup’s material click the Apply then Save buttons and close the Material Editor.
When you are back in the main editor, drag a copy of the Blueprint asset you just created into the viewport if it does not already exist, or drag out as many copies as you want. Then go ahead and test your game.
In this post, we were introduced to many of the basic editing interfaces that the Unreal Editor uses consistently for different components making up a game. We also touched on some topics that should be taken into consideration when exporting assets from Blender for UE4.
In the next installment of this series, we’re going to take a look at how to add interactivity to the assets we imported as well as how to create animated assets in Blender that will be imported into our game.
With Blender changing its polygon engine to include Ngons, as opposed to supporting only triangles and quadrangles (post-Blender version 2.63, circa 2012), a new mesh system known as BMesh was introduced to Blender, bringing with it several new features.
Besides the obvious advantage of building organic (and some inorganic) model types faster, Ngons also brought an addition to Blender’s sculpting toolset in the form of Dyntopo.
Ngons 101
Although Ngons are nothing new in the world of 3D, their use was for many years considered bad practice among many 3D professionals. This bad rap primarily stems from their unpredictable behaviour when it comes to deformation during animation. As you can imagine, this is not of primary concern when Ngons are used within a sculpting workflow, as deformation of the mesh is typically only for rudimentary purposes. There is generally no particular reason to retain transformation or deformation history when sculpting, as this would typically be applied, or baked, into the mesh once sculpting is complete. Many 3D applications will simply remove this historical data by default, so as to further prevent unpredictable results.
Building Meshes in the Early to Mid-2000s
Prior to the massive uptake of 3D sculpting around 2009, modelling was considered the de facto go-to for building meshes. There were various options for producing outcomes matching different purposes, such as box-modelling for organic surfaces, vertex pushing for real-time models or NURBS modelling for industrial design, to name a few popular choices. The main consideration, however, was that regardless of how you modelled a mesh, that mesh would be intrinsically linked to the final outcome (in some way or another). Therein resided the temperament of the era, advocating the use of quadrangles and triangles as part of the mesh building process.
However, a new implementation of 3D sculpting was to change this fundamentally towards the latter part of the 2000s.
By decoupling the process of mesh building from the final outcome, a new genre of 3D artist was ushered in. Unconcerned with the technicalities of modelling or the limitations of the final outcome, sculpting became truly modular, and with this we saw many new (and some old) sculpting applications rise to the foreground, performing primarily one task: sculpting. 3D-Coat, Mudbox, ZBrush and of course Sculptris are some names that may come to mind. Of those, Sculptris (the only defunct sculpting software on that list) arguably accelerated sculpting into this new era by implementing the first stable, non-commercial form of dynamic tessellation in 2009, when it reached version 1.0.
Although Sculptris was acquired by Pixologic (the makers of ZBrush) shortly after reaching version 1.0, dynamic tessellation had already made waves in the 3D community, and we were hungry for it in our beloved Blender.
A Match Made in Heaven
In a nearly parallel timeline with Sculptris’ introduction of dynamic tessellation to the 3D world, Blender developers were readdressing the limitations of the old mesh editing system. This system was slowly being replaced with BMesh, and early adopters were able to start testing and working with the new system, obtainable from GraphicAll, in 2011 (and possibly even before this, in less stable implementations).
BMesh is the underlying mesh editing system used by Blender. It is essentially a programmatic representation of the topology that comprises meshes rendered in the 3D viewports. It has its own API (Application Programming Interface) which exposes very low-level, almost C-like data structures for mesh manipulation; fortunately, if you are only concerned with making art in Blender, you need never know any of that.
Prior to the inclusion of BMesh, all geometry rendered in a 3D viewport in Blender could only be made up of four-sided quadrangles or three-sided triangles. In order to construct a complex 3D model, these simple “building blocks” had to be arranged to represent the shape or form of the model. Of course, this is all relative to the degree of control you want when building your models, as much of the process of assembling the building blocks would typically be automated to some degree.
Destructive vs Non-destructive Sculpting
To clarify, BMesh is not responsible for making sculpting in Blender possible; however, we can certainly credit it with bringing to Blender a more natural and spontaneous approach to sculpting, affording artists the kind of creative freedom that alleviates many technical considerations.
Sculpting in Blender predated BMesh by approximately half a decade, having officially premiered in Blender 2.43 in early 2007. Fundamentally, the difference that BMesh brought to sculpting was a destructive approach to realtime retopologizing, much akin to dynamic tessellation. This, it can be said, was the inspiration for what was to become Dyntopo.
Blender’s sculpting toolkit was, at the time of its inception, very much on par with that of its commercial counterparts, Maya having introduced sculpting as early as Maya 3, released in early 2000. Although this timeline might lead you to think that Blender was slow to catch up, this was not in fact the case. Sculpting just never really took off in 3D until many years later, with the advent of dynamic topology. Prior to this, Blender’s modelling toolkit pioneered an impressive array of intuitive tools, making it a formidable choice for any serious modeller. The point is simply that the focus back then was not the same: it was about creating beautiful, minimalist topology quickly and efficiently. Arguably, Blender delivered in this respect more so than its proprietary counterparts.
Sure, sculpting was available, but as was the case with most high-end 3D suites of the time, the trade-off for being able to create highly detailed models was typically an exceptionally high-resolution mesh. This was often the result of geometry being subdivided in areas where it simply was not necessary, one of the side-effects of non-destructive sculpting at the time.
As you would expect, the great thing about sculpting with non-destructive techniques is that your model retains a history. That history is a hierarchy of deformations: sculpting the low-resolution model contributes to the outcome of the medium-resolution model, which in turn contributes to the higher resolutions, and so forth. There are still many benefits to this approach to sculpting; if you are not already familiar with them you can read more about it here.
Nonetheless, it was with the advent of Dyntopo that a new breed of 3D artist emerged.
Enter Dyntopo
Dyntopo, although a portmanteau of Dynamic Topology, is that and much more. Unlike typical non-destructive sculpting, which only affects the shape of the model, a Dyntopo mesh can have both its shape and its topology influenced by certain sculpt brushes. Although the idea of deformation refactoring topology might seem contradictory, given that topology is typically preserved under deformation, bear in mind that here the topology is effectively ‘dynamic’, and therein resides its namesake. In other words, the old topology is erased and rebuilt, ideally in realtime, with each brushstroke. It is in this refactoring of topology that the destructive nature of this sculpting technology resides, as well as its greatest asset: the ability to increase a model’s resolution only where it is necessary.
Also worth noting is that preservation of topology is typically not an objective artists pursue when working with Dyntopo; in fact, the opposite is true. As previously noted, working with Dyntopo relieves artists of the technical requirements associated with modelling.
In order to create meshes that deform predictably, for animation or other such circumstances, a mesh that was created using Dyntopo should first be retopologized.
Retrospectively working with Dyntopo and BMesh
Fortunately, if you are using a version of Blender greater than 2.62, BMesh is not something you really need to be too concerned about. As BMesh was intended to replace the old mesh system, Blender versions from 2.63 onwards use it by default; this means that geometry can consist of triangles, quadrangles, Ngons or any combination thereof. All you need to do is use Blender’s extensive set of polygon modelling and sculpting tools to create whichever type of polygon you desire and let Blender figure out the rest. The only time you need to be aware of which mesh system you are using is if you wish to import a model created with the new BMesh system into a version of Blender older than 2.63, for whatever reason. In that case, your model might need to be quadrangulated or triangulated, then either saved to a legacy format or exported to a format such as OBJ. Of course, this will disregard all animation, rigging and various other Blender specifics.
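As a rough sketch of that fallback, the triangulation and OBJ export can be scripted from Blender’s Python console. The operator names below are from Blender 2.6x–3.x (the OBJ exporter moved in 4.0), and the output path is hypothetical:

```python
import bpy

# Triangulate the active object's quads and Ngons, then export it to OBJ
# as a lowest-common-denominator format a pre-2.63 Blender can read.
obj = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.quads_convert_to_tris()
bpy.ops.object.mode_set(mode='OBJECT')

obj.select_set(True)
bpy.ops.export_scene.obj(filepath="//legacy_model.obj", use_selection=True)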
Dyntopo, on the other hand, can be turned on and off when working with Blender’s sculpt tools. The result, as noted above, is that you will effectively be using a destructive or non-destructive sculpting technique, respectively.
Enabling Dyntopo in older Blender versions required clicking the Enable Dyntopo button. This was available from sculpt mode, as a tool option for applicable brushes.
In more recent versions of Blender the button has been replaced with a checkbox. Dyntopo is still accessed from Sculpt Mode, as part of the tool settings. Over the years, improvements have been made to various aspects of the tool; significantly, these include a greater degree of control over how the detailing of dynamically generated topology relates to camera proximity or even the size of the brush.
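For scripting-minded readers, the same toggle is exposed through Blender’s Python API. A minimal sketch, assuming an active mesh object and recent-ish (2.8x–3.x) property names; the detail value is an arbitrary example:

```python
import bpy

# Enter Sculpt Mode on the active object and toggle Dyntopo on.
bpy.ops.object.mode_set(mode='SCULPT')
if not bpy.context.object.use_dynamic_topology_sculpting:
    bpy.ops.sculpt.dynamic_topology_toggle()

# Relative detail size in pixels; smaller values yield finer topology.
# (Other detail modes expose different properties, e.g. constant detail.)
bpy.context.scene.tool_settings.sculpt.detail_size = 8.0
```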
Retopologizing a Dyntopo Model
Not all models created with dynamic sculpting need to be retopologized. A few of the reasons you might consider retopologizing your model include: you want a workable UV layout that can be used for texture painting/mapping; your model is intended for animation, especially character-based animation; or you want to export your model to another 3D application or game engine, where working with a high poly count could significantly impair performance.
Retopologizing is a modelling technique used to create a mesh with more suitable, usually realtime, topology. This is in contrast to the topology of a sculpted mesh, and the technique is used for various reasons, including those mentioned above.
There are many free and commercial tools for assisting with or automating the process of retopologizing; however, if you have a background in 3D modelling this will certainly be to your advantage. Sometimes the simplest options provide the best solutions; in terms of retopologizing, you could find all the solace you need in the snap to faces button. Once this button is enabled, it’s a simple case of extruding vertices to create the desired edge loops, then selecting edges and creating polygons. It’s always tempting to create a fully quadrangulated mesh through retopologizing, however this is not always necessary; in certain instances, when the mesh is intended for a static shot, using Ngons in the model can speed up the modelling process with no visible effect on the quality of the rendering. Once a retopologized mesh is created, it’s a simple case of creating a normal map if desired and/or using the Multires and Shrinkwrap modifiers to retain the sculpted details while maintaining a manageable, deformable model. If you are feeling a bit lost at this point you can read more about it here.
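The snapping setup behind that workflow can also be configured in script form. A hedged sketch, using property names from Blender 2.8x–3.x (they may differ in other versions):

```python
import bpy

# Configure the snapping settings used for manual retopology: snap the
# vertices of the new low-poly mesh onto the faces of the sculpt below.
ts = bpy.context.scene.tool_settings
ts.use_snap = True
ts.snap_elements = {'FACE'}
ts.use_snap_project = True  # project individual vertices onto the surface
```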
The Future of Sculpting in Blender
Blender’s sculpting technology is actively being developed, and over the past decade sculptors have received a rapidly expanding toolset that rivals any proprietary solution. One could even say that things have come full circle: where the technology once deviated from a non-destructive workflow emphasising the use of the Multires modifier, we are now seeing new hybrid workflows that utilize aspects of both destructive and non-destructive techniques. But does that mean sculpting is becoming more of a technical than an artistic skill? Not in the least; in fact, the emphasis in the question might be misguided, given that many technical artists embrace such cross-overs and create not just art but the tools with which to make it.
If you want to learn more about some of the new and exciting tools added to Blender’s sculpting toolkit, keep an eye on the Blender Developers Blog.
Now that you have Unreal Engine and the Editor up and running, it’s time to start migrating your digital assets from Blender into the Editor. In this post we will discuss some of the prerequisites for building certain assets in Blender, then exporting them, and finally importing and setting them up within Unreal Editor.
Build the model in Blender
In fact, you are not limited to Blender in terms of building your models for Unreal Engine 4 (UE4); however, we will be focusing on Blender as it provides a great deal of versatility in terms of content creation. We will focus on an animated 3D character that will be exported via the FBX file format, then imported into UE4. When building assets such as this for a real-time engine there are certain considerations to take note of. Working with an application such as Blender in conjunction with UE4 requires a certain degree of understanding as to what role each piece of software plays in the production pipeline. In other words, there are certain tasks that Blender performs well that should be completed before a model is exported for UE4. Among these considerations are:
Topology
Non-overlapping UV’s
Texture mapping
Rigging and Animation
Model Considerations
Topology: Build your models without non-manifold geometry and with edge loops placed with consideration for accurate deformation during animation. Although it will require some experience to get this right, you can start by learning about the basics on this free course.
Non-overlapping UV’s and Texturing: Your model should have its UVs laid out before it is exported from Blender. When unwrapping your model’s UVs for a real-time application, it’s generally best to maximize usage of the 0 to 1 UV texture space and keep your textures square, with dimensions that are powers of two between 2^4 and 2^13 (that is, 16 to 8192 pixels; see the sketch after this list). You can read more about understanding UVs here.
Rigging and Animation: Blender has a vast toolset for creating simple to exceptionally complex animations, and for supporting the content creation process with advanced rigging features. Although you can create animation in UE4, and will certainly encounter times when this is necessary, mastering animation in Blender will significantly raise the quality of your final output. Learning animation is a time-consuming process; if it is not among your interests, simply utilize pre-animated models.
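As a quick sanity check against the texture guideline above, the snippet below scans the images in a .blend file and flags any texture that is not a square power of two within the 16 to 8192 pixel range:

```python
import bpy

def is_pow2(n):
    # True when n is a positive power of two (16, 32, ..., 8192, etc.)
    return n > 0 and (n & (n - 1)) == 0

for image in bpy.data.images:
    w, h = image.size
    if w == 0:  # generated or unloaded images report a zero size
        continue
    if w != h or not is_pow2(w) or not (16 <= w <= 8192):
        print(f"Check texture: {image.name} is {w}x{h}")
```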
Exporting a Model and Rig
Once you have your model prepped with all the above criteria checked, it’s time to begin the export process. In the interest of keeping things simple, we are not going to focus on export settings and the finer details involved therein; instead, we will examine the process as an overview, in order to equip you with the tools to get an animated character from Blender into Unreal Engine 4. Taking a research-based approach to learning from that point will help accelerate your project to the next level.
As mentioned in a previous post, when exporting a model, Location and Rotation transforms should be 0 for all axes.
Scale transforms should be at 1. This applies both to the model and to the armature.
When choosing a unit system (such as None, Metric or Imperial), it is best practice to remain consistent with the same system throughout the entire project. Doing so helps ensure models are imported at the correct scale, ultimately saving you some time.
You will not require any animation for this first step. All that is required is that the model is bound to the rig and adequately weight painted prior to exporting. With the rig selected, go to frame zero and set the rig to its Rest Position if it is not already. If there are any objects in the scene other than the model and rig, such as lights, cameras or other objects, simply delete them; ideally, just the 3D model and rig should remain.
With your rig selected, enter Pose Mode (this is an editing mode of the 3D Viewport, not to be confused with Pose Position; the armature itself should remain in Rest Position) and rename the top-level parent bone, that is, the highest bone in the rig’s hierarchy, to Root.
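The export prep can also be scripted. A minimal sketch, assuming hypothetical object names "Elephant" (the mesh) and "Armature" (the rig); substitute your own:

```python
import bpy

# Keep the unit system consistent across the whole project.
bpy.context.scene.unit_settings.system = 'METRIC'

bpy.ops.object.select_all(action='DESELECT')
for name in ("Elephant", "Armature"):  # hypothetical names
    obj = bpy.data.objects[name]
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj
    # Zero Location/Rotation and unify Scale by applying the transforms.
    bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)

# Rename the top-level parent bone (the one with no parent) to Root.
rig = bpy.data.objects["Armature"]
root = next(b for b in rig.data.bones if b.parent is None)
root.name = "Root"
```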
To reiterate on the process we will first be exporting just the model and rig. Once we are happy with how this has imported into UE4 we will then continue to export the animation.
As we are only interested in the Armature and Mesh, select those object types from Blender’s FBX export options dialog. If you have already deleted all other objects as previously noted, the exported object types will effectively have been limited anyway.
In the interest of keeping animation out of this file, if any already exists, it is recommended to uncheck the Bake Animation setting at first.
When exporting your media to UE4, keeping things as simple as possible is key. Therefore, if you are able to create a workflow that successfully exports only Deform Bones, checking the Only Deform Bones option can help to significantly reduce file sizes and speed up the import/export process.
Bear in mind that if you choose this option for the initial exporting of the character and rig, you should keep this option checked for exporting all remaining animations.
You are now ready to export the file. Save it with a unique name.
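For completeness, here is a sketch of the same export via Blender’s Python API; the file path is hypothetical and the options mirror the dialog settings described above:

```python
import bpy

# Export just the mesh and armature: no baked animation, deform bones only.
bpy.ops.export_scene.fbx(
    filepath="//character_rig.fbx",      # hypothetical output path
    object_types={'ARMATURE', 'MESH'},   # Armature and Mesh only
    use_armature_deform_only=True,       # the Only Deform Bones option
    bake_anim=False,                     # Bake Animation unchecked
    add_leaf_bones=False,                # avoids extra "_end" bones in UE4
)
```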
Project Setup
Launch Unreal Editor to configure a new project. If you have not installed UE4 yet, start here. Select Game for the type of project and Blank for the Project Template.
Choose Project settings that match your system’s capabilities as well as your intended output. You can always change these settings at a later stage. There is no need to include Starter Content, as we will be importing our own content. We won’t be covering Blueprints in this particular post.
Level Setup
Once you have your project set up, the launcher will exit and the Editor will open.
The primary areas we will be focusing on within the Editor are:
Content Browser: This is, by default, located at the bottom of the screen.
Viewport: The viewport is used for a close representation of what your final output may look like. It’s worth noting, however, that your game’s performance should not be measured by that of the viewport.
Preview: A preview creates a much more accurate representation of your application’s performance and appearance, and therefore of the final outcome.
From the Content Browser click Add New and choose New Folder. A new folder will appear within the Content Browser. Rename the folder to your liking; it will be used to store the assets you import into your scene.
Import the Mesh and Rig
With the new folder you just created open, click the Import button within the Content Browser. A file dialog will appear, navigate to where you exported your files from Blender.
Select the FBX file you exported with no animation. This file should only contain the 3D mesh and Rig.
You will be presented with FBX Import Options within Unreal Editor. Ensure that Skeletal Mesh and Import Mesh are checked and that Import Animations is unchecked.
If you chose to export from Blender with Only Deform Bones selected, you may receive a warning within Unreal. This is not an error; it is simply a warning and should be expected given the noted export configuration. You are free to close the dialog, as it will have no impact on your workflow in this instance.
Several new assets will now appear within the Content Browser. Depending on how you exported your asset from Blender, you should at minimum see the 3D mesh object, a Skeleton (a representation of your Blender Armature), a Material and a Physics Asset.
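If you prefer to automate imports, the same options can be set through Unreal’s Python API. A minimal sketch, assuming hypothetical file and destination paths:

```python
import unreal

options = unreal.FbxImportUI()
options.set_editor_property("import_mesh", True)
options.set_editor_property("import_as_skeletal", True)   # mesh + rig
options.set_editor_property("import_animations", False)   # no animation yet

task = unreal.AssetImportTask()
task.set_editor_property("filename", "C:/exports/character_rig.fbx")  # hypothetical
task.set_editor_property("destination_path", "/Game/Characters")      # hypothetical
task.set_editor_property("options", options)
task.set_editor_property("automated", True)  # suppress the options dialog
task.set_editor_property("save", True)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```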
Import the Animation
At this point you are now ready to begin importing your animation.
From the Content Browser, click the Import button again; this time, select the FBX you exported with the animation.
It’s worth noting that, since you have already imported your 3D mesh, you could keep the animation file small and manageable by deleting the mesh from it entirely, retaining only the Armature with its animation. This is something you could have done in Blender before exporting the animation.
Either way, you can always simply delete the duplicate 3D mesh from the Unreal Editor’s Content Browser.
When importing the animation, Unreal should detect the rig it applies to and select it automatically, as can be seen within the FBX Import Options dialog under the Skeleton section. You can also click on the drop-down menu to manually select the appropriate rig.
Ensure that Import Animations is checked this time.
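Scripted, the animation import differs only in its options. The skeleton asset path below is hypothetical and should point at the Skeleton created by the first import:

```python
import unreal

options = unreal.FbxImportUI()
options.set_editor_property("import_mesh", False)
options.set_editor_property("import_animations", True)
# Re-use the Skeleton created by the first import (hypothetical path).
options.set_editor_property(
    "skeleton",
    unreal.EditorAssetLibrary.load_asset("/Game/Characters/character_rig_Skeleton"))

task = unreal.AssetImportTask()
task.set_editor_property("filename", "C:/exports/character_anim.fbx")  # hypothetical
task.set_editor_property("destination_path", "/Game/Characters")
task.set_editor_property("options", options)
task.set_editor_property("automated", True)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```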
Again, you will notice that several new assets have been created. To preview the animation, double-click the Animation Sequence asset.
To place your animated character into the Level, click and drag the Animation Sequence asset into the Viewport. You will be able to place it on the ground plane.
Create and Edit Materials
Now that you have imported your animated character into your Level, it’s time to turn your attention to editing its materials.
When your 3D mesh was imported, Unreal would by default have created a new material for it.
Simply double-click this material in the Content Browser and the Material Editor will open with it loaded for editing.
Let’s now examine a quick material node setup.
Before we start setting up our material you will need some textures to apply to it. Textures can be created by hand, from photo references or by baking from Blender. How you choose to create your model’s textures will be determined by the aesthetic and technical requirements of your project. You can learn more about the texturing process in this free course.
To import your textures into the project, simply follow the same procedure as for importing any asset: from the Content Browser, click Import.
Once your textures have been added to your project it is time to connect them to your model’s material. Double click the material in the Content Browser to open the Material Editor.
Depending on which Shading Model you have selected for your material, certain material inputs will be available and others not. If your model is an organic character, you could choose the Subsurface shading model, thereby making the Subsurface Material Input available.
In order to add the textures you imported to the material inputs, hold the T key on your keyboard and left-click in an empty area within the Material Editor’s graph-like background. A new Texture Sample node is created.
In the Texture Base section of the node’s Details panel (by default on the left-hand side of the screen), you will find a drop-down menu allowing you to select one of the textures you imported. If you are connecting the model’s base color, choose your diffuse/color texture; if you are connecting a Normal map, choose the Normal map texture you baked from Blender, and so forth. You can read more about the different texture types here.
With the correct texture selected, you are now ready to connect the texture to your material. There are a variety of possibilities in terms of how to make this connection, but for the sake of simplicity let’s assume you are connecting a diffuse/color map to your material. In this event, click and drag from the Texture Sample’s RGB output and a connector will appear. Hover over the material’s Base Color input and you will notice a green check mark appear, indicating a valid connection. When you see this, stop dragging and the connection will be made.
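The equivalent connection in script form looks like the following sketch; the material and texture paths are hypothetical placeholders for your own assets:

```python
import unreal

lib = unreal.MaterialEditingLibrary
material = unreal.EditorAssetLibrary.load_asset("/Game/Characters/M_Character")
diffuse = unreal.EditorAssetLibrary.load_asset("/Game/Characters/T_Character_D")

# Create a Texture Sample node, assign the imported texture to it and
# wire its RGB output into the material's Base Color input.
sample = lib.create_material_expression(
    material, unreal.MaterialExpressionTextureSample, -400, 0)
sample.set_editor_property("texture", diffuse)
lib.connect_material_property(sample, "RGB", unreal.MaterialProperty.MP_BASE_COLOR)

lib.recompile_material(material)
```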
Continue to connect and experiment with other textures in a similar way until you have connected all of the textures you imported. You can preview how your changes affect the model as you edit: simply position the Material Editor such that you can see both it and the viewport simultaneously, and click the Apply button.
Once you are satisfied with your material, click Apply then Save in the Material Editor. You can now close the Material Editor, and your materials will be visible on the model within the current Level.
Exclusive offer for Learners
Jumpstart your learning with this exclusive limited offer for RabbitMacht learners. Get the elephant model used in this post with textures, rig and animation at a phenomenal 50% discount! Apply the coupon code below at checkout and get your Royalty Free model to use in your own commercial and non-commercial projects.
Privacy & Cookies: This site uses cookies. By continuing to use this website, you agree to their use.
To find out more, including how to control cookies, see here:
Cookie Policy