Believable Worlds 🌍, Fortnite City 🏙️, UE5 Rigs 🎮

Dec 19, 2025

🌍 Crafting Believable Worlds

Stop Scatter-Detailing: Three Rules for Truly Lived-In Worlds

Wargaming’s Dmitrii Kislitsin explains why randomly scattering “cool details” makes environments feel artificial instead of alive. He shares three principles for believable worlds: follow the logic of entropy when placing wear and dirt, ensure every element has a clear function, and build a shared story at the scene level, not just per prop. From cable routing to scuff paths, every mark must be justified. The result is systemic chaos that feels truly lived-in.

Building Playable Places: How a Real City Came to Fortnite

Tomlin Studio set out to answer a bold question: what if a city’s digital twin could be played like a game? In this session, they detail how they brought Norrköping into Fortnite using UEFN, balancing performance limits, licensing, and municipality requirements for non-violent content. The talk compares photogrammetry vs Gaussian splats, shows how tools like Cesium, OSM, and genAI fit together, and ends with a drone-based city race demo. It’s packed with lessons for anyone eyeing real-world game spaces.

🧠 Tech & Physics for Game Devs

From Newton’s Laws to Moving Particles: Physics for Game Dev

In this video, the creator sets up the initial architecture of a C++ physics engine for game development. A PhysicsSystem static library manages PointMass particles, updating their positions every frame using Newtonian mechanics and the game’s delta time. A clean interface to the rendering system is achieved via an UpdateListener function, so physics never directly depends on engine-specific scene objects. With a simple OpenGL-based visualization, you can already see objects accelerate, fall under gravity, and sync perfectly with the render loop.
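The architecture described above can be sketched roughly as follows. This is a hypothetical minimal version, not the video's actual code: the names `PhysicsSystem`, `PointMass`, and `UpdateListener` come from the summary, but the fields, the 2D vector type, and the semi-implicit Euler integration are assumptions for illustration.

```cpp
#include <cstddef>
#include <functional>
#include <utility>
#include <vector>

// Minimal 2D vector; the real engine likely uses its own math library.
struct Vec2 { float x = 0.f, y = 0.f; };

// A particle with position, velocity, and a constant acceleration (e.g. gravity).
struct PointMass {
    Vec2 position;
    Vec2 velocity;
    Vec2 acceleration;
    float mass = 1.f;
};

class PhysicsSystem {
public:
    // The renderer registers a callback instead of the physics system
    // touching engine-specific scene objects directly.
    using UpdateListener = std::function<void(std::size_t index, const Vec2& pos)>;

    std::size_t AddPointMass(const PointMass& p) {
        particles_.push_back(p);
        return particles_.size() - 1;
    }

    void SetUpdateListener(UpdateListener listener) {
        listener_ = std::move(listener);
    }

    // Advance every particle by the game's delta time using
    // semi-implicit Euler (velocity first, then position), then
    // notify the listener so the render side can sync transforms.
    void Update(float dt) {
        for (std::size_t i = 0; i < particles_.size(); ++i) {
            PointMass& p = particles_[i];
            p.velocity.x += p.acceleration.x * dt;
            p.velocity.y += p.acceleration.y * dt;
            p.position.x += p.velocity.x * dt;
            p.position.y += p.velocity.y * dt;
            if (listener_) listener_(i, p.position);
        }
    }

private:
    std::vector<PointMass> particles_;
    UpdateListener listener_;
};
```

The key design point from the talk, decoupling physics from rendering, shows up here as the `UpdateListener` callback: the physics library compiles with no knowledge of OpenGL or scene graphs, and the visualization layer simply subscribes to position updates each frame.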

Mastering UE5 Modular Rigs: IK/FK and Facial Controls with Agora Characters

Explore how Agora’s high-quality modular rigs turn UE5 into a serious character animation workspace. The instructor walks you through dropping modular rig assets into a level, auto-creating level sequences, and priming rig logic with a single “key everything” pass. You’ll see how to flip limbs between IK and FK, reveal and manipulate tweaker and hidden controls, and drive expressive facial animation with jaw, lip, and mouth “zipper” controls. Because all the characters share the same modular system, mastering one means you effectively understand them all.

🛠️ Tools, Pipelines & the Future

JangaFX 2026 Roadmap: EmberGen 2.0 and Big VFX Upgrades

JangaFX has published its 2026 public roadmap, outlining major updates for EmberGen, LiquiGen, and IlluGen; the roadmap page now also serves as a central hub for changelogs, bug reports, and feature requests. EmberGen 2.0 is a full ground-up rebuild with a new GPU particle system, scrubbable sims via volume caching, and an improved combustion model. LiquiGen 1.1 focuses on complex liquid flows, UV mapping, and white-water rendering. IlluGen 1.2 brings its shader workflow to Unreal, Unity, and Godot, plus robust 3D-to-2D export options.

Blender Asks Every User for €5 to Fully Fund 2026

Blender’s end-of-year fundraising push is off to a strong start, with more than 27,000 one-time donations—ten times the normal monthly amount. To fully secure its 2026 budget, the Blender Foundation is encouraging every active user to contribute a €5 donation or amplify the campaign. Alongside the funding news, Blender 5.0 has just received its first corrective update, addressing 132 issues and reinforcing the project’s commitment to polish and stability.
