Real-Time Syncing of API Documentation to ReadMe.io
This guide shows you how to sync API documentation to readme.io in real-time.
01.08.2024

For an API to be successful, it needs to meet certain quality criteria, such as:
- intuitive and developer friendly, designed for easy adoption
- stable and backward compatible
- secure
- very well documented, with up-to-date documentation
Focusing on the last one raises a few questions:
- How do we document?
- How do we make the documentation public?
- How can we keep the API and the documentation in sync in real-time with minimal maintenance effort?
How do we document?
The sky is the limit here, but there is also a clear standard for describing APIs: the OpenAPI Specification.
For our services, the spec is automatically generated from the C# XML doc comments.
This outputs a standardised JSON or YAML specification that can be interpreted by any tool higher up the stack that supports the same standard, the ubiquitous Swagger UI being the best-known example.
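As an illustration, here is a minimal sketch of that setup with Swashbuckle, assuming the Swashbuckle.AspNetCore package is installed and <GenerateDocumentationFile>true</GenerateDocumentationFile> is enabled in the .csproj; the titles and names are placeholders:

// Program.cs – minimal sketch of generating the OpenAPI spec from C# XML doc comments
using System.Reflection;
using Microsoft.OpenApi.Models;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen(options =>
{
    options.SwaggerDoc("v1", new OpenApiInfo { Title = "My API", Version = "v1" });

    // Pull summaries and remarks from the compiler-generated XML doc file
    var xmlFile = $"{Assembly.GetExecutingAssembly().GetName().Name}.xml";
    options.IncludeXmlComments(Path.Combine(AppContext.BaseDirectory, xmlFile));
});

var app = builder.Build();

app.UseSwagger();   // serves /swagger/v1/swagger.json
app.UseSwaggerUI(); // the familiar Swagger UI on top of it

app.MapControllers();
app.Run();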
That is great for local development and internal access, but once the API goes live we need something publicly accessible by any consumer or by our partners.
How do we make it public?
This is usually done by uploading the spec to some kind of developer portal that supports OpenAPI specs and offers features like partner login, public access, high availability, a user-friendly UI, a playground to try the API, etc.
While there are a few options out there, we will focus on ReadMe.io, which essentially takes an OpenAPI spec as input and layers a nice UI on top of it.
Keeping it all in sync:
The goal is that once a developer makes a change to the API, the documentation updates automatically the moment the code is deployed to production and ready to be consumed, without any additional work: no copying and pasting of text, no manual editing, no updating of wiki pages.
The immediate go-to solution is to automate this process in the CI/CD pipeline.
What we need:
- tooling to push the file to readme.io – the rdme CLI, provided by ReadMe
- an API spec file (json/yaml) – as input to the rdme CLI
- automation in the Azure CD pipeline – to call the CLI whenever we have a code update
Here are the Azure pipeline tasks that do the trick:
- task: CmdLine@2
  displayName: 'Install readme.io CLI'
  inputs:
    script: 'npm install rdme@latest -g'

- task: CmdLine@2
  displayName: 'Update OpenApi spec in readme.io'
  inputs:
    script: 'rdme openapi https://$(PUBLICLY_ACCESSIBLE_HOST)/swagger/v1/swagger.json --key=$(RDME_KEY) --id=$(RDME_SPEC_ID)'
There are a few important points here that can be solved in other ways depending on your setup.
The CLI is run from the context of an Azure pipeline, so the pipeline needs a way to reach the swagger.json file. Either you:
- put it in a DevOps artifact and use it as a local file, or
- take it from a deployment location that is publicly accessible from the CI/CD pipeline
The last two arguments are the access key to readme.io and the ID of the API specification in readme.io; both can be easily obtained from the readme.io admin panel. If you have multiple APIs, you will have multiple IDs. They are stored as locked (secret) variables in the Azure DevOps Library.
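A sketch of how the pipeline can pull those values in, assuming a hypothetical variable group named readme-io defined in the Library:

variables:
  - group: readme-io  # contains RDME_KEY (secret) and RDME_SPEC_ID

The $(RDME_KEY) and $(RDME_SPEC_ID) macros in the task above are then resolved at run time.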
Private infrastructure:
You might run into the case where your API is deployed inside infrastructure that Azure DevOps pipelines cannot reach. This was our case:
We built our custom API gateway using YARP, which does not expose Swagger documents for the proxied services out of the box.
You can work around this by exposing the swagger endpoints through the API gateway.
If you have multiple services, you expose them on different endpoints in the API gateway, since YARP requires a unique URL pattern per route.
For example:
- https://$(API_GATEWAY_CUSTOM_DOMAIN_HOSTNAME)/swagger/myAPI1/v1/swagger.json
- https://$(API_GATEWAY_CUSTOM_DOMAIN_HOSTNAME)/swagger/myAPI2/v1/swagger.json
Then implement a URL transformer in YARP to rewrite the URL to /swagger/v1/swagger.json
just before forwarding the requests to the private services.
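A minimal sketch of such a transform in the gateway's Program.cs, assuming the routes and clusters are loaded from configuration and the swagger routes are named myApi1-swagger and myApi2-swagger (hypothetical names):

// Program.cs of the YARP gateway – a sketch only
using Yarp.ReverseProxy.Transforms;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddReverseProxy()
    .LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"))
    .AddTransforms(context =>
    {
        // Only rewrite the swagger routes; regular API traffic passes through untouched
        if (context.Route.RouteId is "myApi1-swagger" or "myApi2-swagger")
        {
            // Whatever public swagger path matched, forward it to the private
            // service as the conventional /swagger/v1/swagger.json
            context.AddPathSet(new PathString("/swagger/v1/swagger.json"));
        }
    });

var app = builder.Build();
app.MapReverseProxy();
app.Run();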
This is enough for most scenarios, but there is yet another special case that can occur on top of everything else.
Special scenario:
If you have multiple APIs, you might keep their input/output models in a separate NuGet package that is referenced by all of them.
In this scenario, the generated OpenAPI spec will be missing all the XML documentation that lives in that external NuGet package: every model and property description coming from the package is simply absent by default.
The problem is a bug in the .NET project system that prevents the XML doc file shipped inside the NuGet package from being copied to the output folder of the API.
How to fix it:
- A bit of manual intervention is required inside the .csproj file:
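A sketch of what that can look like, assuming the shared models package is called MyCompany.Api.Models (a placeholder name):

<Target Name="CopyModelsXmlDoc" AfterTargets="Build">
  <ItemGroup>
    <!-- For every locally copied reference dll, pick up the sibling .xml doc file,
         but only for the shared models package and only if the file actually exists -->
    <ModelsXmlDoc Include="@(ReferenceCopyLocalPaths->'%(RootDir)%(Directory)%(Filename).xml')"
                  Condition="'%(ReferenceCopyLocalPaths.Filename)' == 'MyCompany.Api.Models' And Exists('%(ReferenceCopyLocalPaths.RootDir)%(ReferenceCopyLocalPaths.Directory)%(ReferenceCopyLocalPaths.Filename).xml')" />
  </ItemGroup>
  <!-- Copy the matching XML doc next to the API's own build output -->
  <Copy SourceFiles="@(ModelsXmlDoc)" DestinationFolder="$(OutputPath)" SkipUnchangedFiles="true" />
</Target>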
This iterates over all referenced dll files and tries to copy the corresponding XML files to the output directory, but only if the file exists and matches the name of our models' NuGet package; otherwise a lot more XML would be copied over and would needlessly increase the deployment size.
This works beautifully locally, but it will not work in a Docker environment where the image is built by the Azure pipelines.
- To make it work from the pipelines, where you will likely use the dotnet publish command, a new line needs to be added after the initial copy operation.
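Continuing the sketch above, the added line is simply a second Copy pointing at the publish directory:

<!-- Added right after the first <Copy> in the same target -->
<Copy SourceFiles="@(ModelsXmlDoc)" DestinationFolder="$(PublishDir)" SkipUnchangedFiles="true" />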
The only difference is the destination folder: instead of the output path, it is the publish directory.
- The last touch is to set the NUGET_XMLDOC_MODE environment variable so that the restore operation unpacks the XML docs from the NuGet packages together with the DLLs. The official .NET SDK Docker images set it to skip by default to shave a few seconds off the pipeline run time, so it needs to be overridden to none before dotnet restore runs.
# Override the SDK image default (skip) so that restore unpacks the XML doc
# files from the NuGet packages alongside the DLLs
ENV NUGET_XMLDOC_MODE=none
RUN dotnet restore "myApiProject.csproj"
RUN dotnet build "myApiProject.csproj" -c Release -o /app/build

FROM build AS publish
RUN dotnet publish "myApiProject.csproj" -c Release -o /app/publish
Wrap up:
To ensure the success of your API, don't neglect the documentation side of it. Never rely on manual updates of wiki pages or Confluence pages, or on the UI editors that some tools and services put at your disposal.
Always strive to automate and correlate the API code deployment with updated docs deployment.
The hoops you might need to jump through to achieve this will vary with your particular case, system and technology stack, but the strategic goal should be the same.