## Introduction

A few Wizcorp engineers attended Unite Tokyo 2018 to learn more about the future of Unity and how to use it in our future projects. Unite Tokyo is a three-day event held by Unity in several major cities, including Seoul, San Francisco and Tokyo. It takes the form of talks given by Unity employees from around the globe, in which they give insight into existing or upcoming technologies and teach people about them. You can find more information about Unite here.

In retrospect, here is a summary of what we learned or found exciting, and what could be useful for the future of Wizcorp.

## Introduction: first day

- The presentation on ProBuilder was very interesting. It showed how to quickly build levels in a way similar to Tomb Raider, for example: you can place blocks and slopes, snap them to a grid, quickly add prefabs and test everything without leaving the editor, speeding up the development process tremendously.
- They gave a presentation on ShaderGraph. You may already be aware of it, but in case you are not, it is worth checking out.
- They talked about the lightweight render pipeline, which provides a new modular architecture for Unity, with the goal of getting it to run on smaller devices. In our case, that means we could get a web app in as little as 72 kilobytes! If it delivers as expected (end of 2018), it may seriously challenge the need to stick to web technologies.
- They showed a playable web ad that loads and plays within one second over Wi-Fi, then drives the player to the App Store. They think this is a better way to advertise your game.
- They have a new tool set for the automotive industry, allowing very good-looking simulations to be made with models of real cars.
- They are running Unity Hack Week events around the globe.
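As a side note, the snap-to-grid behaviour that tools like ProBuilder provide boils down to rounding coordinates to multiples of the grid size. Here is a minimal illustrative sketch (not ProBuilder's actual code; the function names and grid size are made up), in plain Python for brevity:

```python
def snap(value: float, grid: float = 0.25) -> float:
    """Round a single coordinate to the nearest multiple of the grid size."""
    return round(value / grid) * grid

def snap_point(point, grid: float = 0.25):
    """Snap a 3D point (x, y, z) onto the grid."""
    return tuple(snap(c, grid) for c in point)

# A roughly placed block corner lands exactly on the grid:
print(snap_point((1.13, 0.49, -2.61)))  # -> (1.25, 0.5, -2.5)
```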
Check out Unity Hack Week if you are not aware of it.
- They introduced the Burst compiler, which aims to take advantage of multi-core processors and generates code with the target hardware's math and vector floating-point units in mind, providing substantial runtime performance improvements.
- They presented improvements in the field of AR, typically with a game played on a sheet of paper that you hold in your hand.

## Anime-style rendering

They presented the processes they use in Unity to get as close as possible to anime-style rendering, and the result was very interesting. Nothing is rocket science, though: it mostly involves effects you would use in other games, such as full-screen distortion, blur, bloom, synthesis on an HDR buffer, cloud shading, a weather system achieved through fog, skybox color configuration and fiddling with the character lighting volume.

## Optimization of mobile games by Bandai Namco

- In Idolmaster, a typical stage scene has only 15k polygons, and a character has a little more than that. They make the whole stage texture fit on a single 1024x1024 texture for performance.
- For post-processing they have depth of field, bloom, blur and flare, with 1280x720 as a reference resolution (with MSAA).
- The project was started as an experiment in April 2016, started officially in January 2017, and was released on June 29th of the same year.
- They mentioned taking care to minimize draw calls and SetPass calls.
- They use texture atlases with indexed vertex buffers to reduce memory usage and improve performance.
- They used the Snapdragon Profiler to optimize for the target platforms.
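To illustrate the texture-atlas idea: many small textures are packed into one large texture, and each mesh's UVs are remapped into the corresponding sub-rectangle, so that many objects can share a single material and be batched into fewer draw calls. A minimal Python sketch, under the made-up assumption of a square atlas laid out as a uniform grid of tiles:

```python
def atlas_uv(u: float, v: float, tile_index: int, tiles_per_row: int = 4):
    """Remap a local (u, v) in [0, 1] into tile `tile_index` of an
    atlas laid out as a tiles_per_row x tiles_per_row grid."""
    tile = 1.0 / tiles_per_row
    col = tile_index % tiles_per_row
    row = tile_index // tiles_per_row
    return (col * tile + u * tile, row * tile + v * tile)

# The centre of tile 5 (row 1, column 1) of a 4x4 atlas:
print(atlas_uv(0.5, 0.5, 5))  # -> (0.375, 0.375)
```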
- Their approach was to try, improve, try again, and stop once it was good enough.
- One of the big challenges was live performances with 13 characters on screen (lots of polygons and data).

## Unity profiling and performance improvements

This presentation was given by someone who audits commercial games and supports them in improving performance or fixing bugs.

http://github.com/MarkUnity/AssetAuditor

- Mipmaps add 33% to a texture's size; try to avoid them.
- Enabling read/write on a texture asset always adds 50% to its size, since a copy has to remain in main memory. The same goes for meshes.
- Vertex compression (in the Player Settings) simply uses half-precision floating-point values for vertices.
- Experiment with the animation compression settings.
- Crunched ETC textures are decompressed on the CPU, so be careful about the additional load.
- Beware of animation culling: when off-screen, culled animations are not processed (as if disabled). With non-deterministic animations, this means that when the animation is enabled again, it has to be computed for the whole time it was disabled, which can create a huge CPU spike (this can also happen when disabling and then re-enabling an object).

## Presentation of Little Champions

- Looks like a nice game.
- It was started on Unity 5.x and later ported to Unity 2017.x.
- They do their own custom physics processing by using WaitForFixedUpdate from within FixedUpdate; the OnTriggerXXX and OnCollisionXXX handlers are called afterwards.
- They have a very nice iPad level editor that they used during development. They say it was the key to creating good puzzle levels: test them quickly, fix, and try again, all from the device the game will actually run on.

## Machine learning

A very interesting presentation that showed how to teach a computer to play a simple Wipeout clone.
It was probably the simplest setup you could get, since you only steer left or right and look out for walls using 8 ray casts.

I enthusiastically suggest reading about machine learning yourself, since there is no room for a full explanation of the concepts covered there in this small article; the presenter, however, was excellent.

Some concepts:

- There are two training methods: reinforcement learning (learning through rewards, trial and error and super-speed simulation, so that the agent becomes "mathematically optimal" at the task) and imitation learning (learning through demonstrations, like humans do, without rewards, requiring real-time interaction).
- You can also use cooperative agents (one brain, the teacher, and two agents, like players or hands, playing together towards a given goal).
- Learning environment: Agent <- Brain <- Academy <- TensorFlow (for training the AIs).

## Timeline

Timeline is a Unity tool designed to create animations that manipulate the entire scene based on time, a bit like Adobe Premiere. It consists of tracks, with clips that animate properties (a bit like the default animation system). It is very similar, but adds many features aimed at creating movies, typically for cut scenes; for example, animations can blend into one another.

The demo the presenter showed was very interesting: he used Timeline to create an entire RTS game. Every section was scripted (enemy reactions, cut scenes, etc.), and depending on conditions the track head would move and execute the appropriate section of scripted gameplay.

He also showed a visual-novel-like system (where the timeline waits for input before proceeding).

He also showed a space shooter.
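The clip blending mentioned above is essentially a weighted mix of overlapping clips. Here is a rough sketch of the idea (in plain Python rather than Unity C#; the clips and timings are hypothetical, not the presenter's code):

```python
def blend_weight(t: float, fade_start: float, fade_end: float) -> float:
    """Linear weight of the incoming clip during a crossfade."""
    if t <= fade_start:
        return 0.0
    if t >= fade_end:
        return 1.0
    return (t - fade_start) / (fade_end - fade_start)

def evaluate(t, clip_a, clip_b, fade_start, fade_end):
    """Value of a property animated by two clips whose ends overlap."""
    w = blend_weight(t, fade_start, fade_end)
    return (1.0 - w) * clip_a(t) + w * clip_b(t)

# Clip A holds 0.0, clip B holds 10.0; they overlap between t=1 and t=2:
clip_a = lambda t: 0.0
clip_b = lambda t: 10.0
print(evaluate(1.5, clip_a, clip_b, 1.0, 2.0))  # -> 5.0
```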
In the space shooter, the movements and patterns of bullets and enemies, then waves and full levels, were each made into tracks, and those tracks were combined at the appropriate hierarchical level.

Ideas for using Timeline: a rhythm game, an endless runner, ...

On a personal note, I like his approach: he gave himself one week to try creating a game using this technology as much as possible, so that he could see what it is worth.

What was interesting (and hard to summarize in a few lines here, but I recommend checking it out) is that he alternates between using Timeline to dictate the gameplay and using it the other way around. Used wisely, it can be a great game-design tool for quickly building a prototype.

Timeline is able to instantiate objects and read scriptable objects, and it is very extensible.

It is also used by programmers or game designers to quickly create the "scaffolding" of a scene and hand it to the artists and designers, instead of having them guess how long each clip should take, and so on.

Another interesting feature of Timeline is the ability to start or resume at any point very easily.
This is very handy in the case of the space shooter, for instance to test difficulty and level transitions.

I suggest downloading "Default Playables" from the Asset Store to get started with Timeline.

## Cygames: optimization for mid-range devices

Features they used:

- Sun shafts
- Lens flares (using Unity's collision feature to determine occlusion; it was a challenge to set colliders properly on all the relevant objects, including, for example, the fingers of a hand)
- Tilt shift (not very convincing: it just uses the depth information to blur in post-processing)
- Toon rendering

They rewrote the lighting pipeline entirely and packed various maps (such as the normal map) into the environment maps.

They explained where ETC2 is appropriate over ETC: it basically reduces color banding, but takes more time to compress at the same quality and is not supported on older devices, which is why they chose not to use it until recently.

Other than that, they mentioned various techniques they used on the server side to ensure a good framerate and responsiveness. They also mentioned reserving a machine with a 500 GB hard drive just for the Unity Cache Server.

## Progressive lightmapper

The presentation was about the progress on the new lightmapper engine, of which we already got a video some time ago (link below). This time, the presenter applied it to a small game he was making, with a sort of toon-shaded environment, and showed what happens with the different parameters and the power of the new lighting engine.

A video: https://www.youtube.com/watch?v=cRFwzf4BHvA

The progressive lightmapper has to be enabled in the Lighting settings (instead of the Enlighten engine). The big news is that baked lights are now displayed directly in the editor (instead of having to start the game, get Unity to bake them, and so on). The scene is initially displayed without lighting, and little by little, as results become available, textures are updated with the baked light information.
You can continue working in the meantime.

The "Prioritize view" option makes the lightmapper bake whatever is currently visible in the Scene view first.