How To Make An AWS-like Service Accessible Through boto3?

A product or service on AWS may rely on countless AWS native services. However, quite often that is not enough, and bespoke or custom-configured systems are deployed. To use such a system, custom permission management is usually needed, custom authentication protocols have to be deployed, and custom APIs are exposed and consumed with custom client libraries, even though everything runs on AWS.

In October 2019, I had a lightbulb moment. What would it take to write a custom service on AWS and use it as an AWS service, through a boto3 client? Codename “Elastic Unicorn Service”, like this:

import boto3

# Create an Elastic Unicorn Service (EUS) client and
# print the details of the unicorn with id 'u-00001'.
eus = boto3.client('eus')
unicorn = eus.get_unicorn(UnicornId="u-00001")
print(f"UnicornName = {unicorn['UnicornName']}")
print(f"HornLengthInFeet = {unicorn['HornLengthInFeet']}")
# Output:
# UnicornName = Calypso
# HornLengthInFeet = 3

After a brief period of going through the botocore source and experimentation, I managed to build a working prototype of “Elastic Unicorn Service”, and I am really excited to share the results in this blog post. I will dive into each part individually below, but it is surprisingly easy to achieve. There are only two principal components in this prototype:

  1. botocore loaders
  2. the Amazon API Gateway service

botocore loaders

At the time of writing this blog post, the Wikipedia page claims that “in 2020, AWS comprised more than 212 services.” So it may not come as a surprise that hand-writing an API client for each service would be a behemoth task, and there are no S3Client or EC2Client classes in the code. Instead, boto3 uses botocore to read so-called service model files. They describe the AWS service API call structure and are read by a class called Loader. This information is used to generate a client class at run-time (see this method). botocore also provides an event system with ample extension points to customize the generated classes. I have not looked at the non-Python AWS SDKs, but presumably their client code generation is automated to a similar degree.
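The run-time generation relies on Python’s ability to build classes dynamically with the type() built-in. As a toy illustration of the mechanism only (this is not botocore’s actual code, and all names below are made up for the example):

```python
# botocore builds client classes at run-time rather than shipping one
# class per service. A minimal sketch of the same mechanism:
def make_client_class(service_name, operations):
    """Build a client class with one generated method per API operation."""
    def make_method(op_name):
        def method(self, **kwargs):
            # A real client would serialize kwargs per the service model
            # and send a signed HTTP request; here we just echo the call.
            return {"operation": op_name, "params": kwargs}
        method.__name__ = op_name
        return method
    attrs = {op: make_method(op) for op in operations}
    return type(service_name.upper() + "Client", (object,), attrs)

EUSClient = make_client_class("eus", ["get_unicorn", "describe_unicorns"])
client = EUSClient()
print(client.get_unicorn(UnicornId="u-00001"))
# {'operation': 'get_unicorn', 'params': {'UnicornId': 'u-00001'}}
```

In botocore, the generated methods additionally consult the service model for serialization, validation, and signing, but the class-building step is essentially this.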

There is not a lot of information about the structure of the model files, apart from the explanation of the directory layout they live in, under the loader documentation. But browsing through the existing service models, as well as turning on boto3 debug output with boto3.set_stream_logger('', logging.DEBUG), can give an idea of what’s what.

For the purposes of this experiment, three things are important:

  1. The endpoints.json file, like this one, contains the common attributes of AWS services: the hostname pattern, the signature algorithm, the regions in which services are present, and various custom overrides.
  2. The service-2.json file, like this one for the AWS Config service, defines the API calls and the structures for inputs, outputs, and error responses.
  3. The AWS_DATA_PATH environment variable or the ~/.aws/models/ directory, where botocore looks for the above files to be able to generate the client classes.
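Installing a custom model boils down to putting the file in the right place. A sketch, assuming an api-version directory of 2019-12-31 (the version string is illustrative, and the file contents here are a placeholder rather than a full model):

```shell
# Layout expected by botocore's Loader:
#   <data-path>/<service-name>/<api-version>/service-2.json
mkdir -p ./models/eus/2019-12-31
printf '%s' '{"version": "2.0"}' > ./models/eus/2019-12-31/service-2.json

# AWS_DATA_PATH entries are searched before ~/.aws/models and the
# built-in data, so botocore will now discover the "eus" service.
export AWS_DATA_PATH="$PWD/models"
```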

I won’t go into a lot of detail about the structure, because I could not do it justice: it is not documented, and I worked most of it out myself. Some of the properties and values were self-explanatory; when they were not, a good look at the source code and debug output usually revealed more detail. Instead, I will provide final examples.
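To give a concrete flavour, here is a sketch of what a minimal service-2.json for GetUnicorn could look like. The property names follow the existing service models shipped with botocore; the shape names and values are illustrative, not the exact files from my prototype:

```json
{
  "version": "2.0",
  "metadata": {
    "apiVersion": "2019-12-31",
    "endpointPrefix": "eus",
    "protocol": "rest-json",
    "serviceFullName": "Elastic Unicorn Service",
    "signatureVersion": "v4"
  },
  "operations": {
    "GetUnicorn": {
      "name": "GetUnicorn",
      "http": {"method": "POST", "requestUri": "/GetUnicorn"},
      "input": {"shape": "GetUnicornRequest"},
      "output": {"shape": "GetUnicornResponse"}
    }
  },
  "shapes": {
    "GetUnicornRequest": {
      "type": "structure",
      "required": ["UnicornId"],
      "members": {"UnicornId": {"shape": "String"}}
    },
    "GetUnicornResponse": {
      "type": "structure",
      "members": {
        "UnicornName": {"shape": "String"},
        "HornLengthInFeet": {"shape": "Integer"}
      }
    },
    "String": {"type": "string"},
    "Integer": {"type": "integer"}
  }
}
```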

API Gateway

The API Gateway service provides the AWS IAM authentication and permission management. The actual API running behind API Gateway could be hosted anywhere. That is it.
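For illustration, the relevant part of a CloudFormation template could look like this. The resource names are hypothetical; the key piece is AuthorizationType: AWS_IAM, which makes API Gateway require SigV4-signed requests from IAM principals:

```yaml
GetUnicornMethod:
  Type: AWS::ApiGateway::Method
  Properties:
    RestApiId: !Ref EusApi
    ResourceId: !Ref GetUnicornResource
    HttpMethod: POST
    AuthorizationType: AWS_IAM   # callers must sign requests with IAM credentials
    Integration:
      Type: AWS_PROXY
      IntegrationHttpMethod: POST
      Uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${EusFunction.Arn}/invocations
```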

The Code

My original experiment in 2019 used API Gateway MOCK integration to return a static response. While writing this blog post I realized I wanted to make the experiment more interactive. Hence, I have added an inline Lambda to provide a very basic Elastic Unicorn Service API with two calls: GetUnicorn and DescribeUnicorns. A short recording of how it looks in action:

The repository for the “Elastic Unicorn Service” can be found on GitHub and the README file should provide enough instructions on how to try it out.


In the end, this remains an experiment, and I did not apply it anywhere, mostly because the line between a little-known low-level API and an undocumented internal API is very blurred here. It would not have been prudent to build a service on an internal/undocumented API.

If this were officially supported, I can see a few immediately appealing benefits:

  1. No need for custom authorization or authentication backends. Permissions can be controlled using identity-based IAM policies for API execution. Consider a policy statement like this attached to a role or user:

    - Effect: Allow
      Action: execute-api:Invoke
      Resource:
        - arn:aws:execute-api:eu-west-1:012:api-id/Prod/GetUnicorn
        - arn:aws:execute-api:eu-west-1:012:api-id/Prod/DescribeUnicorns

    Granted, it is not as nice as a named permission action could be (Action: eus:GetUnicorn), but it allows permission management as granular as that of the real AWS services.

  2. The botocore Stubber infrastructure comes “for free”, with validation against the service model, simplifying unit testing.

    import boto3
    import pytest
    from botocore.stub import Stubber

    eus = boto3.client("eus")

    @pytest.fixture
    def eus_stub():
        # Stub out network calls; responses are validated against the service model.
        with Stubber(eus) as stubber:
            yield stubber

    def test_get_unicorn(eus_stub):
        expected = {
            "Unicorn": {"UnicornId": "u-abcdef0123", "UnicornName": "Prongs"},
        }
        eus_stub.add_response("get_unicorn", expected, {"UnicornId": "u-abcdef0123"})
        actual = eus.get_unicorn(UnicornId="u-abcdef0123")
        assert actual == expected
  3. Increased consistency between client libraries within the AWS Cloud.

  4. There is no client code to write at all. It is generated at run-time from the service model specification.

Two immediate cons are:

  1. Lack of documentation about the structure of the custom service models.
  2. Lack of information on whether custom service models are considered a public API or not.

However, I do find this pattern of building AWS-like services on AWS very intriguing, and I wonder what the botocore maintainers would say about it. As far as I am aware, or was aware back in 2019, this has not been tried publicly before. I am curious to hear what you think. You can get in touch on Twitter or LinkedIn.
