Code-Held

Blog about Spring, Java, JVM, Clean Code, Continuous Integration, Docker, Unity and many more topics from my daily work

In a project I worked on I saw that nearly every entity and value object was created with Lombok’s @Builder. The reasoning was that it makes it easier to construct these objects - especially for tests. But it comes with a cost. The problems that these builders create can’t be detected by the compiler and are especially dangerous in every CI environment.
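To illustrate the kind of defect the compiler cannot catch, here is a minimal sketch - the class and fields are my own example, not taken from the post. With a builder, a mandatory field can simply be forgotten:

```java
import lombok.Builder;
import lombok.Value;

@Value
@Builder
class Customer {
    String name;
    String email; // mandatory in the domain, but nothing enforces it
}

class BuilderPitfall {
    public static void main(String[] args) {
        // Compiles and runs, but email is silently left null.
        // A plain all-args constructor would have forced us to pass every field.
        Customer customer = Customer.builder()
                .name("Jane Doe")
                .build();
        System.out.println(customer); // Customer(name=Jane Doe, email=null)
    }
}
```

Such a gap only surfaces at runtime - typically as a NullPointerException in a test, on CI, or in production.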

Unit tests had a bad reputation in many teams I worked with. To my confusion I even experienced a team that wrote a hell of a lot of integration tests but rarely any unit tests. This is contrary to the well-known testing pyramid, and surprised me quite a bit. The reason - as I was told - was the team’s experience. “When you change something you have to adapt many unit tests” was the common refrain. So they decided to write integration tests which call the real endpoints instead. This comes with a cost. You generally have a harder time identifying where an issue lies when an integration test - which goes through the whole system - fails. Also the runtime does not lead to fast feedback - one of the main benefits of having proper unit tests. I discovered that the issue is that many developers never learned how to write proper unit tests. In this post I will cover the best practices that I developed throughout my career.

Spring Data provides an easy way of keeping track of who creates and modifies a persistent entity, as well as when the action happened, by annotating properties with @CreatedBy, @CreatedDate, @LastModifiedBy and @LastModifiedDate. The values are provided automatically by implementations of the AuditorAware and DateTimeProvider interfaces.
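As a minimal sketch of how such an audited entity can look - assuming Spring Data JPA with auditing enabled via @EnableJpaAuditing; the class and field names are illustrative, not from the post:

```java
import java.time.Instant;

import javax.persistence.Entity;
import javax.persistence.EntityListeners;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.springframework.data.annotation.CreatedBy;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.annotation.LastModifiedBy;
import org.springframework.data.annotation.LastModifiedDate;
import org.springframework.data.jpa.domain.support.AuditingEntityListener;

@Entity
@EntityListeners(AuditingEntityListener.class) // fills the audited fields on persist and update
public class Article {

    @Id
    @GeneratedValue
    private Long id;

    @CreatedBy
    private String createdBy;        // resolved through the AuditorAware bean

    @CreatedDate
    private Instant createdDate;     // resolved through the DateTimeProvider (or the default clock)

    @LastModifiedBy
    private String lastModifiedBy;

    @LastModifiedDate
    private Instant lastModifiedDate;
}
```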

I have been working with Jenkins Pipeline for three years now, and one pain point is proper isolation of shared functionality between pipelines and even between steps. In our repository we defined multiple pipelines, and some are so large that we share functionality within them. Jenkins offers the possibility to create shared libraries for that purpose. But unfortunately it’s not possible to load one from the same repository. Since many of the changes in the pipeline are related to a change in the shared library, it was tedious to match the branches and versions to stay backward compatible. What we actually wanted was to have the shared library in our main repository so that the state of the pipeline is pinned to the state of the main repository. And we finally figured out how to do (hack) that.

In this post we’ll go through an example application and see which methods and principles we can apply to build a robust application that is easy to maintain, extend, understand and use. In general, this is a subject with a much larger scope than a simple blog post can cover, so the content is neither complete nor exhaustive, but a selection of topics that I visited recently. We use an arbitrary business case where we can buy and sell resources on a market. Our example is implemented in Java, using Spring Shell for a simple frontend. However, please keep in mind that even if the technology you use operates differently, the principles stated in this post remain true for every language. Without further ado, let’s start with an example of how our application works.
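The post’s actual example session isn’t reproduced in this excerpt; purely as an illustrative stand-in, a Spring Shell command for such a market could look roughly like this (all names are assumptions, not the post’s code):

```java
import org.springframework.shell.standard.ShellComponent;
import org.springframework.shell.standard.ShellMethod;

@ShellComponent
public class MarketCommands {

    // Invoked on the shell prompt, e.g.: buy WOOD 10
    @ShellMethod("Buy an amount of a resource on the market.")
    public String buy(String resource, int amount) {
        // In the real application this would delegate to the domain layer.
        return "Bought " + amount + " " + resource;
    }

    // Invoked on the shell prompt, e.g.: sell WOOD 5
    @ShellMethod("Sell an amount of a resource on the market.")
    public String sell(String resource, int amount) {
        return "Sold " + amount + " " + resource;
    }
}
```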

At some point it hits everyone: your precious work of several hours vanishes because of a hardware failure. The industry came up with several solutions for that, like version control systems. But recently I was dumb enough not to commit for several hours - just to keep the history clean. Big mistake… But did you ever think about automating that? How about committing every couple of minutes automatically? In this post I’ll explain this - at first sight - weird workflow.


About Me


Marcus Held

Since July I have been working as a principal software engineer for grandcentrix. Previously I worked for Innogames, where my main responsibility was the development of a highly scalable backend with Java and Spring that was capable of handling millions of players worldwide. Furthermore, I took care of their Jenkins for continuous integration, their build pipeline with Gradle and all the technical challenges we faced to get our game out into the world.

At the age of 12, I had my first experience with programming. At 15 I started working professionally for companies and by 18, prior to finishing school, I founded my first company: Web as Art. I studied computer science at Hochschule Bonn-Rhein-Sieg and live in Hamburg with my wife and two sons.

Boost Your Development With Proper API Design (code.talks 2019)

With 1600 attendees, code.talks is the largest developer conference in Germany. Around 400 people listened to my talk, in which I discussed several aspects of robust software architecture design.

Rise of Cultures

What is it?

Rise of Cultures is a simulation game where you guide your civilization through the ages, meet other cultures, conquer continents and build up your cities.

What did I do?

I developed and maintained the backend, which we built with Java, Spring and Hibernate. While I worked on the project we released it for the first time on the US market to test early retention KPIs.

Sunrise Village

What is it?

Sunrise Village was a character-driven simulation game in which the player built up a village and explored the world with their character. The game featured rich world exploration and an extensive production simulation.

What did I do?

I designed and developed the server from the early days of production with Java, C#, Spring, Hibernate, RabbitMQ and .NET Core. One of the main features of the backend was a .NET Core application that used the same business logic as our client. The backend was capable of simulating multiple players moving on the same map.

Gates of Epica

What is it?

Gates of Epica was an action RPG developed with Unreal Engine 4 for iOS and Android. In the game the player fought for loot and glory in more than 600 hand-crafted missions and joined glorious multiplayer boss fights where many players fought a boss for days.

What did I do?

My responsibility was the development of the backend and the game logic. We used Java with Spring Boot, PostgreSQL and Hibernate.

Legends of Honor

What is it?

Legends of Honor is a massively multiplayer online strategy browser game. In the game you take control of a medieval kingdom and move through the world with your army in real time.

What did I do?

When the project started I joined right away as the first backend developer. In this role I had the technical responsibility to design the server architecture and to lead a team of 10 backend developers until the launch of the project.

Shadow Kings

What is it?

As a successor to Goodgame Empire, it was planned to target a more casual audience with similar gameplay. Shadow Kings was released on PC, iOS and Android.

What did I do?

I took over the project in the last months of its existence as the first backend developer. During this time it was my responsibility to lead a team of 6 backend developers and to fork the server off from its origins in Goodgame Empire.

Goodgame Empire

What is it?

Goodgame Empire is a massively multiplayer browser game with more than 70 million registered players. As a player you build up your castle to rule over four different kingdoms.

What did I do?

I started working on Empire in 2014 and was one of the main backend developers in one of the two feature teams we operated. On this project we ran a Java-based server which handled thousands of concurrent users with a high number of requests per minute.

Technologies

Tools