Security Challenges of Serverless Architectures

Using serverless has many benefits: you can avoid provisioning servers, and you don’t have to update or patch any server operating systems. These tasks become the responsibility of the cloud provider. But this doesn’t mean that serverless applications are devoid of any security issues. In fact, they have their own special challenges, some of which we will analyze in this article.

Serverless is a relatively new term that is becoming more and more popular. The concept of serverless has been around since 2014, when AWS Lambda was launched; as a result, many people equate serverless with functions and functions as a service (FaaS). If you are not familiar with FaaS, it is a platform that allows you to run, operate, and deploy code without the complexity of managing infrastructure. A function is simply a piece of code that you deploy to the FaaS platform; it can be very simple and do only one thing.

However, serverless is not only FaaS. Serverless services offer three promises to their users: to pay only for what you use, to have no need to manage infrastructure, and to have the ability to scale up and down automatically according to traffic. Therefore, when we talk about serverless, we're also talking about all of the managed services that are provided by cloud platforms and many other smaller providers. The latter include Auth0 (authentication and authorization), Algolia (free text search), and MongoDB Atlas (NoSQL database).

The increased usage of managed services in our systems is not something new. S3 was one of the first managed services in the cloud and dates back to 2006. So, serverless security problems do not just exist for people using functions. They are also an issue for everyone using managed services in one way or another in applications.

In traditional architectures, the major security risks are in the code and the environment where that code is running. In contrast, the biggest risks in serverless applications are the configuration of the different services and the integration between them. This is where most of the attacks can happen, including those that we will discuss below.

Complexity of the Entry Points

Because serverless applications are event-driven, a function may consume events from a wide range of sources. This increases the attack surface of the application, since each event type has its own message structure, which can be complex and hard to validate. Given the number and variety of entry points, spread across many parts of the application, developers often do not know how to protect their code from every possible attack.

For example, one of the most common attacks enabled by this challenge is event injection: untrusted input is passed directly into your code without being validated. When a function consumes an event, parts of the event object may be controlled by an attacker and thus pose a danger to the application.

How to mitigate this: Always verify inputs; don't trust anything that comes from outside the application. Never pass input information to another service without validating it. Finally, make sure that the function running this code has the fewest and lowest possible permissions, something we will talk more about in a moment.
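As a sketch of this kind of defensive validation (the event fields and the allowed values below are hypothetical, not from any particular application), a function can check every field it consumes against an explicit schema before using it:

```python
import re

# Hypothetical schema: the only fields this handler consumes, with
# explicit rules for what each one may contain.
ALLOWED_ACTIONS = {"create", "read", "delete"}
USERNAME_RE = re.compile(r"^[a-zA-Z0-9_-]{1,32}$")

def validate_event(event):
    """Return the validated fields, or raise ValueError on untrusted input."""
    action = event.get("action")
    username = event.get("username")
    if action not in ALLOWED_ACTIONS:
        raise ValueError("unexpected action")
    if not isinstance(username, str) or not USERNAME_RE.match(username):
        raise ValueError("invalid username")
    # Only validated values are passed on; the rest of the event is ignored.
    return {"action": action, "username": username}
```

Anything an attacker controls in the event object is rejected or normalized here, before it can reach another service.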

Complexity of the Configurations

Serverless applications tend to be complex distributed systems, so they are vulnerable to many attacks that stem from poor configurations. Relevant attacks include the exploitation of over-privileged functions and the exploitation of incorrectly stored application secrets.

Exploiting Over-Privileged Functions

Serverless functions should always follow the principle of least privilege; your function should only be given permissions that are essential for it to perform its logic. It is important to grant the fewest permissions possible, on the fewest resources possible.

Managing permissions tends to become a difficult task, as defining fine-grained permissions for every function quickly grows complex. When a function has more permissions than it needs, an attacker who compromises it can perform malicious operations in your system and take control of other resources.

How to mitigate this: Give the least amount of permissions possible to each individual function. There are many tools today that can help you define permissions for each cloud provider, such as the AWS IAM best practices checklist. If you are using the Serverless Framework, you can use the PureSec plugin to create the exact IAM roles desired for your project. For Microsoft Azure, you can use shared access signatures, and GCP also has a recommended way of assigning privileges to resources.
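As an illustration of what least privilege looks like in practice, the sketch below expresses an IAM policy document as a Python dict. The account ID, region, and table name are hypothetical placeholders; the point is that both the actions and the resource are scoped to exactly what one function needs, a single read-only DynamoDB table:

```python
# Sketch of a least-privilege IAM policy document. The account ID,
# region, and table name are hypothetical placeholders.
READ_ORDERS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # Only the read actions this function actually performs ...
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            # ... and only on the one table it reads from.
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
        }
    ],
}
```

A function with this role cannot write to the table, touch other tables, or call any other service, so a compromise of its code is contained.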

Exploiting Incorrectly-Stored Application Secrets

This problem is also quite common in traditional architectures. Frequently, secrets are (incorrectly!) stored in a configuration file in plain text, making them vulnerable to unauthorized access. The worst scenario is when this configuration file ends up in a public repository. Another flawed approach is to store secrets as environment variables in plain text, because these variables can be easily read by others and leaked.

How to mitigate this: Store your secrets as encrypted text and keep the key for this encryption in a secure place. Most cloud providers offer some way to manage encryption keys, including the AWS Key Management Service, which you can use with services such as the AWS Systems Manager Parameter Store. This store keeps your secrets encrypted at rest and decrypts them at runtime, when your function requests them. For Microsoft Azure, you can use the Key Vault. For GCP, you can use Cloud KMS for encrypting secrets that you can later store somewhere safe. If you are using the Serverless Framework, the new Enterprise version has built-in secret management as well.
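One common pattern is to fetch and decrypt secrets at runtime and cache them for the life of the container, so that every invocation does not repeat the lookup. In the sketch below, `fetch_parameter` is an injected placeholder for a real call, for example a thin wrapper around SSM's GetParameter with decryption enabled; injecting it keeps the sketch self-contained:

```python
# Sketch: fetch decrypted secrets at runtime instead of storing plain text.
# `fetch_parameter` is a placeholder for a provider call such as SSM
# GetParameter with decryption enabled; it is injected so the sketch
# stays self-contained and testable.
class SecretStore:
    def __init__(self, fetch_parameter):
        self._fetch = fetch_parameter
        self._cache = {}

    def get(self, name):
        """Return a secret, fetching and caching it on first use."""
        if name not in self._cache:
            # Decryption happens on the provider side; the plaintext
            # only ever lives in memory, never in config files.
            self._cache[name] = self._fetch(name)
        return self._cache[name]
```

With this shape, no plaintext secret appears in configuration files, environment variables, or source control; the secret exists only in the running process.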

Inadequate Testing

Performing security tests in a serverless architecture is more complex than in traditional applications. Because of this complexity, testing tends to be done poorly, leading to many possible attacks. Potential problems include third-party malicious dependencies and improper error handling.

Third-Party Malicious Dependencies

It's a good practice to use as few dependencies in your functions as possible. The more dependencies you have, the bigger the deployment package will be, and the longer the function will take to start executing. Dependencies are useful when you don't want to reinvent the wheel, because they allow you to add existing, tested functionality to your project. For example, if you need to encrypt a string, you could write the whole encryption logic in your code, or you could simply use an existing library that is tested and proven to solve this problem. Dependencies can save a lot of time if you pick the right ones, but they can also be risky if you import malicious ones.

It’s common to import a few dependencies into your functions. Just be aware that some might have vulnerabilities, which will add risks to your application.

How to mitigate this: Keep track of all the dependencies that your system is using, validate them against libraries of known vulnerabilities, remove unnecessary dependencies, and don’t forget to update them regularly.
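The core of such a check is simple: compare your pinned dependencies against a database of known advisories. The sketch below uses a hypothetical, hard-coded advisory table as a stand-in for the real databases that tools like `npm audit` or `pip-audit` query:

```python
# Hypothetical advisory data: maps a package name to the versions known
# to be vulnerable. Real audit tools pull this from an advisory database.
KNOWN_VULNERABLE = {
    "leftpad": {"1.0.0"},
    "eventparser": {"2.1.0", "2.1.1"},
}

def audit(dependencies):
    """Return the (name, version) pairs that match a known advisory.

    `dependencies` is a mapping of package name -> pinned version,
    as you would read from a lock file.
    """
    return [
        (name, version)
        for name, version in sorted(dependencies.items())
        if version in KNOWN_VULNERABLE.get(name, set())
    ]
```

Running a check like this in CI, against a real advisory feed, catches vulnerable versions before they are deployed.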

Improper Error Handling

A serverless application is a distributed system and, in general, developers tend to build the integration between services around the "happy flow" (a flow without errors). If some errors are caught, they are the most common ones. It's hard to test for all potential errors, and sometimes cloud providers do not have documentation available for all possible errors that a system might throw.

Attackers can benefit from this by creating erroneous situations that a system is not prepared to handle, since those flows were not designed by the developers. The errors can also provide debugging information that the attackers can use to obtain privileged information about the system.

How to mitigate this: Make sure that you are handling all errors. Run some chaos engineering tests, use a monitoring tool that provides live debugging of the application, and don’t be too verbose in your error messages.
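As a sketch of the "don't be too verbose" advice (the handler shape and logger setup here are assumptions, not a specific provider's API), a function can log full details internally while returning only a generic message to the caller:

```python
import logging

logger = logging.getLogger("orders")

def handler(event, process):
    """Run `process(event)`, hiding internal details from the caller.

    `process` stands in for the function's real business logic.
    """
    try:
        return {"status": 200, "body": process(event)}
    except Exception:
        # Full details go to the logs, where operators can see them ...
        logger.exception("unhandled error while processing event")
        # ... but the caller gets a deliberately vague message: no stack
        # traces, library names, or resource identifiers to mine.
        return {"status": 500, "body": "internal error"}
```

An attacker probing with malformed events gets nothing back but "internal error", while the operations team still sees the full stack trace.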

Inadequate Security Tools

Traditional security protection tools don't work very well in serverless applications. It's not possible to add security layers to protect the underlying application, since there is no access to the server or its operating system. This can lead to many possible attacks, such as attacks that go unnoticed due to the lack of good monitoring and logging tools, and denial of service.

Lack of Adequate Monitoring and Logging Tools

The lack of good tooling can prevent an operations team from having proper visibility into their system and being aware of potential or real attacks, making it easy for an attacker to exploit these blind spots.

How to mitigate this: Keep good logs, log all entries to your system, log any changes in the configuration, and send alerts when there is anomalous behavior in the system. Also, it is advisable to use some serverless monitoring tool for tracing and debugging your serverless applications.
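As a toy sketch of the alerting idea (the threshold and the notion of "anomalous" here are deliberately simplistic assumptions; real monitoring tools use far richer signals), count events per source and flag any source whose volume exceeds a limit:

```python
from collections import Counter

def find_anomalies(events, threshold):
    """Flag sources that produced more events than `threshold`.

    `events` is an iterable of (source_ip, action) records, as might be
    read from access logs.
    """
    counts = Counter(source for source, _action in events)
    return sorted(source for source, count in counts.items() if count > threshold)
```

Feeding access logs through even a crude detector like this, and alerting on its output, is far better than having no visibility at all.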

Denial of Service

Yes, denial of service is possible in serverless applications; it's known as "resource exhaustion." Most cloud providers place soft limits on the number of available resources; this prevents the customer from having to pay a high bill if an error occurs or their system is attacked. For example, there is a maximum number of function executions that can run concurrently. After that number is reached, further events won't execute until some capacity is freed up.

How to mitigate this: Set appropriate timeouts when calling different resources, and throttle requests. AWS has a default limit of 1,000 for the total number of concurrent executions across all functions. Therefore, if some function is more vulnerable to this kind of exploit, you can limit that function's concurrency, so that it cannot consume your entire pool of concurrent executions.
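The reservation idea can be sketched as a simple counter; this is an illustration of the mechanism, not how any provider actually implements it. Invocations beyond the function's limit are throttled until an earlier one finishes:

```python
class ConcurrencyLimit:
    """Toy model of a per-function concurrency cap."""

    def __init__(self, limit):
        self.limit = limit
        self.active = 0

    def try_start(self):
        """Admit an invocation if capacity remains, else throttle it."""
        if self.active >= self.limit:
            return False  # throttled: the event must wait or be retried
        self.active += 1
        return True

    def finish(self):
        """Release capacity when an invocation completes."""
        self.active -= 1
```

Capping a vulnerable, internet-facing function this way means an attacker flooding it exhausts only that function's reservation, not the whole account's concurrency pool.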

Conclusion

Serverless reduces many security risks in traditional applications, but it brings new problems as well. Just as there are the OWASP top 10 security risks for traditional applications, there is a similar document for serverless applications. You should review this guide periodically, since it will change over time as new risks are discovered.

Reblaze provides robust cloud-native security that works just as well for serverless as it does for more traditional architectures. Here’s how to contact us to learn more or schedule a demo.
