Talk to most Java developers, and they will tell you that Java is the most secure programming language out there. But, of course, they would say that.
The truth is that whilst Java made huge advances over older languages – particularly C and C++ – when it comes to security, the level of vulnerability of code written in Java depends on programmers following best practices.
This is particularly true in today’s development environment. New hacking techniques, new defensive tooling, and novel forms of storage and encryption mean that many are questioning the old certainties about Java security. Among the new challenges faced by Java developers are the security concerns of cloud migration. On the other hand, new security auditing techniques like chaos engineering present many opportunities for developers to increase the security of their code.
In this article, we’ll take a look at five principles that should be followed when coding in Java in 2020. Ideally, these principles should be integrated into a DevSecOps process, in which security is built into development from the ground up, but they are equally useful for auditing legacy code.
1. Audit your libraries
Let’s start with the most obvious source of vulnerability for software built on Java: external libraries. For the vast majority of projects, whether written in Java or any other language, libraries comprise the majority of code. Unfortunately, many dev teams can’t provide a list of the third-party libraries that they use.
Using external libraries is not a problem in itself, of course. Indeed, in 2020 most of the time of most developers is spent working with third-party libraries. As clients’ needs develop, what started out as a simple piece of accounting software will have libraries added to realize a markup calculator, tax return generator, and ROI statistics package. What matters is that, each time a new library is added, devs check it carefully for known vulnerabilities.
Auditing your libraries is not only good for security, either. Whilst you are auditing, you might find other issues that are adversely affecting performance. And if the library you are auditing is open source, you can then use the opportunity to report the bug and build your team’s reputation across the open source community.
2. Manage Application Secrets
Java developers have also fallen into some bad habits when it comes to managing application secrets. In fact, the community can be broken into two camps: those who sacrifice security in order to give users the smoothest possible experience of their software, and those who expect users to spend 4 hours inputting credentials for their own good.
In reality, coding in 2020 means striking a balance between security and usability. Too much focus on usability often leads to insecure code. Too much focus on security means that your users will spend most of their time trying to get around the security measures you have put in place. And, as any experienced developer knows, users will eventually find a way around every security measure you put in place.
When it comes to managing application secrets, Java developers can learn a lot from reading about the differences between CMS platforms. That might come as a shock to some, but hear us out. The huge user bases of most CMS platforms mean that their developers have had to carefully think through how to manage application secrets whilst maintaining usability for the average user.
Many of these platforms make use of a high-quality key management service like AWS KMS, which ensures that the secrets they store don’t live in memory any longer than they have to. Approaches like this, though denigrated by many “serious” Java developers, would be a welcome addition to a lot of Java software.
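Even without a managed key service, the underlying principle can be applied with nothing but the standard library: keep secrets in a mutable buffer (a char array rather than an immutable String) and wipe them as soon as they have been used. The sketch below is a minimal illustration of this idea; the authenticate method is hypothetical, standing in for whatever API actually consumes the credential.

```java
import java.util.Arrays;

public class SecretHandling {
    // Hypothetical consumer of a credential. The point is the pattern:
    // the secret is wiped in a finally block, so it is zeroed even if
    // the consuming code throws.
    static boolean authenticate(char[] password) {
        try {
            // ... hand the secret to the real authentication API here ...
            return password.length > 0;
        } finally {
            // Zero the buffer so the secret doesn't linger on the heap
            Arrays.fill(password, '\0');
        }
    }

    public static void main(String[] args) {
        char[] secret = {'h', 'u', 'n', 't', 'e', 'r', '2'};
        boolean ok = authenticate(secret);
        System.out.println(ok);               // the credential was accepted
        System.out.println(secret[0] == '\0'); // and has already been wiped
    }
}
```

A String would defeat this pattern: Strings are immutable and interned, so the secret could sit in memory until the garbage collector gets around to it.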
3. Use Mature Encryption Libraries
One particular kind of library should be audited and analyzed more often than others: those you use for encryption. Java libraries for encryption have historically been extremely difficult to work with, with APIs that are less than helpful for the average developer.
Unfortunately, this has led many Java devs to take matters into their own hands, and write their own encryption libraries. In fact, many in the community take pride in their home-brew encryption, and are skeptical about using code written by someone else. This is a huge mistake. There are developers who spend their entire working lives making unhackable encryption libraries, and trust us: theirs are better than yours.
The best way to approach encryption in Java is to use the built-in tools that the platform gives you. There is no point – and significant downsides – to reinventing the wheel (see below). The cryptography libraries that ship with the Java platform have been scrutinized by thousands of experts; yours haven’t.
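To make this concrete, here is a short sketch of authenticated encryption using only classes that ship with the JDK (the javax.crypto packages): AES in GCM mode, a freshly generated key, and a random IV. Treat it as an illustration of the built-in API rather than a complete key-management story.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class BuiltInCrypto {
    private static final int GCM_TAG_BITS = 128;

    // AES-GCM encryption using only providers bundled with the JDK
    static byte[] encrypt(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        return cipher.doFinal(plaintext);
    }

    static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        return cipher.doFinal(ciphertext);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // GCM requires a fresh, unpredictable IV for every encryption
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        byte[] ct = encrypt(key, iv, "sensitive data".getBytes(StandardCharsets.UTF_8));
        String recovered = new String(decrypt(key, iv, ct), StandardCharsets.UTF_8);
        System.out.println(recovered);
    }
}
```

Note that GCM also authenticates the ciphertext: tampering with a single byte makes doFinal throw rather than silently return garbage, which is exactly the kind of detail a home-brew scheme tends to miss.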
4. Validate Your Inputs
As we’ve noted above, a large part of programming in 2020 is making sure that your users don’t break your lovingly crafted software. One of the easiest ways of doing this is to spend some time validating user inputs. Done correctly, input validation has two huge advantages: not only does it make your applications more secure, but it also makes them easier to use.
In 2020, we should also remember that “user input” doesn’t only come from humans. The Mirai botnet has illustrated the dangers of developers not properly validating inputs from IoT devices, and this is going to become even more critical in the coming decade.
Like many of the principles on this list, when it comes to input validation devs can simply make use of the tools that ship with Java itself. The Scanner class in java.util provides a basic way to constrain user input with an eye to security. Using validation tools like this means that you don’t have to write complex, custom validation logic. If you don’t want to, that is.
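As a small sketch of the idea, the method below uses Scanner’s hasNextInt check to reject non-numeric and out-of-range tokens instead of letting them reach application logic (the method name and bounds are ours, chosen for illustration):

```java
import java.util.Scanner;

public class InputValidation {
    // Reads the first integer in [min, max] from the Scanner,
    // discarding tokens that are non-numeric or out of range.
    static int readBoundedInt(Scanner in, int min, int max) {
        while (in.hasNext()) {
            if (in.hasNextInt()) {
                int value = in.nextInt();
                if (value >= min && value <= max) {
                    return value;
                }
                // numeric but out of range: fall through and keep scanning
            } else {
                in.next(); // discard the non-numeric token
            }
        }
        throw new IllegalArgumentException("no valid input found");
    }

    public static void main(String[] args) {
        // Simulated user input: a non-numeric token, an out-of-range
        // value, then an acceptable one
        Scanner in = new Scanner("abc 9999 42");
        System.out.println(readBoundedInt(in, 0, 100)); // 42
    }
}
```

In production code the same loop would typically re-prompt the user rather than silently skip tokens, but the principle is identical: the invalid input never escapes the validation layer.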
5. Don’t Reinvent the Wheel
Finally, a catch-all principle that applies to all developers, in all languages, everywhere. Don’t make your own version of something that is readily available.
Given the frequent headlines about vulnerabilities being found in widely-used libraries, many developers feel that they are better off making their own libraries. If no-one else edits the library, goes the logic, then no-one else will be able to find vulnerabilities in it. The problem with this is that obscure code is not inherently more secure than publicly available code. In fact, the main difference is that with thousands of people checking open-source libraries, vulnerabilities are found quickly. Better the devil you know, and all that.
This principle applies to almost everything you do as a Java developer. We’ve seen instances in which developers spend months making their own encryption protocols (see point 3) for sending application data across the web. That might be fun, but here’s an idea: just use a VPN service. It will take seconds to install, and you can get on with more important things.
Unfortunately, perfect security is impossible. Everyone writes insecure code from time to time, and it’s easy to fall into the habit of relying on insecure libraries.
The key to ensuring security in Java development is to have a system for finding security vulnerabilities and shutting them down. Given how Java will develop in 2020, it’s more important than ever that developers scan the horizon for new security threats, and be ready to respond to them. Already there are serious questions about whether the cybersecurity industry can keep up with 5G, and the IoT presents another challenge.
Above all, Java developers should realize that ensuring security in their code is a process, and not an event. You might have made an application secure when it was first released, but what about the code that others have added to it? And what about the ways that your users are actually using it?
All of these questions need to be addressed through careful auditing throughout a program’s lifecycle. Rather than writing your code, sending it off, and getting on with the next project, your team needs to have in place rigorous monitoring processes that ensure it stays as secure as the day it shipped. Your users will take a long time to forgive you if it is your program that led to a data breach, so make sure you take that responsibility seriously.