Welcome to the Sales API.

Sales API is a collection of all the APIs needed to perform a sale ranging from the creation of Orders and receiving Payments to sending Receipts to the customer. In this section we will document the different available APIs and guide you through your first sale.



All endpoints in the Sales API require an authentication header. The authentication header must be an Entur-issued OAuth2 bearer token. In order to issue a valid token you must have a client_id and a client_secret. (TODO)

curl --request POST \
  --url 'https://<type_of_token>.<environment>' \
  --header 'content-type: application/json' \
  --data '{"grant_type":"client_credentials","client_id":"<clientid>","client_secret":"<clientsecret>","audience":"https://api.<environment>"}'


Most endpoints also require a header describing which point of sale (POS) the request originates from. This value is sent in the Entur-POS header and indicates where the product was purchased. It is a convenience header for the client, and it is up to the client using the API to decide its content.
All requests also carry a correlation id, which stays attached to a request as it propagates through the system and shows which request is being processed at each step. This value is created and maintained automatically by the internal logging library, and should not require any further action from client developers.
Description      Header key      Example value
Point of Sale    Entur-POS       Entur-APP
Correlation Id   correlationId
Authorization    Authorization   Bearer xxxxxxx.yyyyyyyy.zzzzzzzzz
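As an illustrative sketch (the host and token here are placeholders, not real values), a request carrying these headers could look like:

```shell
# Illustrative only: <environment> and <token> are placeholders.
# The payments path is taken from the pagination example in this document.
curl --request GET \
  --url 'https://api.<environment>/sales/v1/payments?page=1&perPage=4' \
  --header 'Authorization: Bearer <token>' \
  --header 'Entur-POS: Entur-APP'
```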



The Entur Sales API follows a unified versioning scheme for the entire service portfolio, meaning that all services within the API should expose the same endpoint versions.
Bumping this global version number indicates a breaking API change, and thus should happen infrequently.

The API version consists of a "v" prefix followed by the API version number, e.g. "v1", "v2", etc.

API consistency:
All endpoints for a specific API version should guarantee API consistency – a client should be confident that the API will not change, as long as the version remains constant.
For development purposes, endpoints subject to change should keep a beta version number throughout development, i.e. "/v1beta/...".


Some endpoints in the API can potentially return large arrays of data, which could be taxing for the client. For this reason, the client must use pagination to limit the number of results returned from the API.
Specifying current page and page size

To specify which page to be returned from the API, use the page query parameter in the URL. There is a default page size set for the relevant services, but this size can be overridden by specifying the perPage query parameter.
In addition to the items for the current page, the response will contain the total item count (totalItems) and the total number of pages (totalPages).
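The relationship between these fields is plain ceiling division; a small shell sketch with the numbers from the example response below:

```shell
# totalPages is the ceiling of totalItems / perPage.
totalItems=16
perPage=4
totalPages=$(( (totalItems + perPage - 1) / perPage ))
echo "$totalPages"
```

With 16 items and a page size of 4, this prints 4.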

This is an example of a query to /sales/v1/payments?page=1&perPage=4:


{
    "items": [
        {
            "paymentId": 2396,
            "totalAmount": "1446.00",
            "currency": "NOK",
            "orderId": "ABCDEFGH",
            "orderVersion": 1,
            "transactionHistory": [],
            "links": []
        },
        {
            "paymentId": 2433,
            "totalAmount": "152.00",
            "currency": "NOK",
            "orderId": "HIJKLMNO",
            "orderVersion": 1,
            "transactionHistory": [],
            "links": []
        },
        {
            "paymentId": 2434,
            "totalAmount": "152.00",
            "currency": "NOK",
            "orderId": "PQRSTUVW",
            "orderVersion": 1,
            "transactionHistory": [
                {
                    "transactionId": 2167,
                    "currency": "NOK",
                    "paymentType": "VISA",
                    "cardNumber": "XXXXXX******XXXX",
                    "summary": {
                        "remainingAmountToCapture": "0.00",
                        "capturedAmount": "152.00",
                        "remainingAmountToCredit": "152.00",
                        "creditedAmount": "0.00"
                    },
                    "amount": "152.00",
                    "status": "CAPTURED",
                    "links": []
                }
            ],
            "links": []
        },
        {
            "paymentId": 2438,
            "totalAmount": "152.00",
            "currency": "NOK",
            "orderId": "XYZABCDE",
            "orderVersion": 1,
            "transactionHistory": [
                {
                    "transactionId": 2171,
                    "currency": "NOK",
                    "paymentType": "MASTERCARD",
                    "cardNumber": "XXXXXX******XXXX",
                    "summary": {
                        "remainingAmountToCapture": "0.00",
                        "capturedAmount": "152.00",
                        "remainingAmountToCredit": "152.00",
                        "creditedAmount": "0.00"
                    },
                    "amount": "152.00",
                    "status": "CAPTURED",
                    "links": []
                }
            ],
            "links": []
        }
    ],
    "totalItems": 16,
    "totalPages": 4
}

Query filters

Paged endpoints can also support filter queries. These are query parameters that can be appended multiple times to build up a more complex filter.

Filter operators

  • eq - equal to
  • ne - not equal
  • gt - greater than
  • gte - greater than or equal
  • lt - less than
  • lte - less than or equal


/orders?totalAmount=gte:1000.00&organisationId=eq:1 would return all orders where the total amount is greater than or equal to 1000.00 and the organisationId equals 1.
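As a sketch, the same filter sent with curl (host and token are placeholders; the full path is taken from the order search example in this document):

```shell
# Placeholders: <environment>, <token>. Filter operators are prefixed to the value.
curl --request GET \
  --url 'https://api.<environment>/sales/v1/orders?totalAmount=gte:1000.00&organisationId=eq:1' \
  --header 'Authorization: Bearer <token>' \
  --header 'Entur-POS: Entur-APP'
```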


The Entur Sales API has a standardised way of representing the value of products. Firstly, a currency code must always be specified for a given product (e.g. "NOK", "USD", etc.). Secondly, the amount for the given currency is represented as a numeric string with two decimal places (e.g. "100.00").
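When generating such amount strings from numeric values, make sure to emit exactly two decimals with a dot separator; a minimal shell sketch:

```shell
# Format a numeric amount as the API's two-decimal string form.
# LC_ALL=C guarantees a dot (not a comma) as the decimal separator.
amount=$(LC_ALL=C printf '%.2f' 1446)
echo "$amount"
```

This prints 1446.00, matching the totalAmount in the example order.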

The example below shows how an order costing NOK 1446.00 is represented in the API. Note that many fields have been omitted from the order structure for better readability.

{
    "id": "ABCDEFGH",
    "version": 1,
    "status": "CONFIRMED",
    "totalAmount": "1446.00",
    "totalTaxAmount": "154.93",
    "balance": "0.00",
    "currency": "NOK",
    "paymentRefs": [
        {
            "orderVersion": 1,
            "paymentId": 6406,
            "amount": "1446.00",
            "currency": "NOK",
            "completedAt": "2018-09-03T12:47:18Z"
        }
    ],
    "travellers": [],
    "reservations": [],
    "organisationId": 1,
    "createdAt": "2018-09-03T12:44:27Z"
}



All dates returned from the Entur Sales API follow the ISO 8601 standard, i.e. "YYYY-MM-DDTHH:mm:ssZ".

Note that time is always represented in UTC, so the developer needs to convert it to the correct local time before it is presented to the end user.
2018-12-24T18:00:00Z is an example of a valid timestamp returned from the API.
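A sketch of that conversion using GNU date (this assumes a Linux environment with tzdata installed; Europe/Oslo is just an example zone):

```shell
# Convert the API's UTC timestamp to an example local time zone (GNU date).
utc='2018-12-24T18:00:00Z'
local_time=$(TZ=Europe/Oslo date -d "$utc" +'%Y-%m-%dT%H:%M:%S%z')
echo "$local_time"
```

In December, Europe/Oslo is UTC+1, so this prints 2018-12-24T19:00:00+0100.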




  1. Complete your first sale with the sales API
    1. Get Offer
    2. Get Offer Details
    3. Create Order
    4. Reserve Offer
    5. Create Payment
    6. Create a Payment Transaction
    7. Create Terminal
    8. Capture Transaction
    9. Generate ticket distribution
    10. Distribute ticket distribution
  2. Searching for Orders
  3. Configuring Payment Methods
  4. Accessing the external Kafka Cluster
    1. Prerequisites
    2. Quickstart
    3. Security
    4. Testing Client Configuration
    5. Topic description
    6. Schema registry

Complete your first sale with the sales API

This guide shows how to create an order, populate it with an offer, pay for the order and distribute tickets. Each step contains a request body (if needed) and the response. If the request body is missing, the request can or should be executed without a body. We recommend that newcomers to the API try this example first.

Get offer

Get an offer for a trip

Example request: Oslo S (NSR:StopPlace:337) - Bergen (NSR:StopPlace:548)

 Example request
 Example response

Get offer details

Get offer details. Use the id field of the tripPattern from the previous response to retrieve details about that offer.

Example response

Create Order

Create an order to which you want to add offers. This is done by sending a POST request with an empty body.

 Example response
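Since the guide's original request snippet is not reproduced here, this is only a hedged sketch of the empty-body POST; the path is assumed from the order search endpoint elsewhere in this document, and host and token are placeholders:

```shell
# Assumed path; <environment> and <token> are placeholders.
curl --request POST \
  --url 'https://api.<environment>/sales/v1/orders' \
  --header 'Authorization: Bearer <token>' \
  --header 'Entur-POS: Entur-APP'
```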

Reserve offer




Reserve the chosen offer and add it to the specified order.

 Example request
 Example response

Create Payment

Create a payment for the order.

 Example request
 Example response
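A hedged sketch only: the path is assumed from the payments endpoint shown in the pagination example, and the body fields here are hypothetical placeholders, not the documented request schema:

```shell
# Hypothetical body fields; consult the guide's own example request for the real schema.
curl --request POST \
  --url 'https://api.<environment>/sales/v1/payments' \
  --header 'Authorization: Bearer <token>' \
  --header 'Entur-POS: Entur-APP' \
  --header 'content-type: application/json' \
  --data '{"orderId": "<orderId>", "orderVersion": 1, "currency": "NOK", "totalAmount": "1446.00"}'
```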

Create a Payment Transaction

Create a payment transaction for the specified payment.

 Example request
 Example response

Create Terminal

Create a terminal to execute the transaction. Use the paymentId and the transactionId from the previous requests.

 Example request
 Example response

Capture Transaction

When the transaction has been executed successfully, we need to mark it as completed. Use the paymentId and transactionId from the previous requests.

 Example response

Generate ticket distribution

Generate a ticket distribution. Use the orderId from the previous requests.

 Example response

Distribute ticket distribution

Distribute the ticket distribution. Use the orderId from the previous requests.

 Example response


Searching for orders






Searching for orders can be accomplished by using multiple filter queries to narrow down a search. Complex filters can be created by combining multiple query parameters, even repeating the same parameter name. All query parameters are joined together using the AND operator.

In the example below, a complex query combines multiple query parameters to find all orders in status DRAFT with a version greater than 1 and a total amount greater than or equal to 100.00 and lower than 200.00.

GET /sales/v1/orders?status=DRAFT&version=gt:1&totalAmount=gte:100.00&totalAmount=lt:200.00


Configuring Payment Methods

In order to complete payments using the Sales API, each clientId must be configured with the Payment Methods it should have access to.

Payment Methods are defined by two parameters: PaymentTypeGroup and PaymentType.


Each Payment Type must be configured for the client by Entur. Please contact your Partner Assistant if you need any Payment Methods to be configured for your clientId.


Accessing the external Kafka Cluster


In order to access the external Kafka Cluster the following things need to be prepared:

  • A username and a password for the Kafka cluster
  • Organization-specific topics for your organization need to be created

If you don't have these things in order, please contact your Partner Assistant to obtain access. You will not be able to connect and test your implementation without them.

Before starting implementation we recommend that you familiarize yourselves with the following documentation:


As the Confluent documentation shows, there isn't that much configuration required to get started with a Kafka client. We have highlighted the few configuration properties required to be able to connect and communicate with Entur's external Kafka cluster.


List of servers on the Kafka cluster used to bootstrap the connection. All brokers for each environment are shown below.


Note: The standard client port is not open, as we require every client to use a secure connection via SSL and SASL – see Security for more information. 

Collective identification of all clients that belong to the same application. All instances of the same application must use the same group id. The group id should be set explicitly, preferably to a descriptive name, to make it easier to identify which user should have access to each group.


URL used to connect to the Avro schema storage. 


Health checks

We provide a health check topic, where random test data is published without a schema, that you can use to test client configuration – see Testing for more information.


Our Kafka cluster is set up with in-flight message encryption, using one-way SSL for all communication. For this reason, all clients must be able to trust the certificate from a Kafka broker in order to produce or consume messages. This requires some extra security configuration on the client side, as well as a trust store containing the Entur CA (Certificate Authority) certificate. The client only needs to trust this CA.

1. Generate trust store and import CA certificate:

keytool -import -keystore client.truststore.jks -alias entur-ca -file certs/entur-cacert.pem -storepass $TRUSTSTORE_PASSWORD -keypass $TRUSTSTORE_PASSWORD -noprompt
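To verify the import, the trust store contents can be listed; the entur-ca alias should appear:

```shell
# List the trust store to confirm the entur-ca certificate was imported.
keytool -list -keystore client.truststore.jks -storepass $TRUSTSTORE_PASSWORD
```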

2. Client configuration:

security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
     username="<your provided username>" \
     password="<your provided password>";

Example Java code with staging addresses (change to production from the list above to connect to the production environment):

Properties properties = new Properties();
// Staging bootstrap servers; replace with the production brokers to connect to production.
properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<staging bootstrap servers>");
properties.put(ConsumerConfig.GROUP_ID_CONFIG, "group-id");
properties.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
properties.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
properties.put("specific.avro.reader", true);
properties.put("schema.registry.url", "<schema registry URL>");

// Security
properties.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");

properties.put(SaslConfigs.SASL_MECHANISM, "SCRAM-SHA-512");
properties.put(SaslConfigs.SASL_JAAS_CONFIG, String.format(
        "org.apache.kafka.common.security.scram.ScramLoginModule required\nusername=\"%s\"\npassword=\"%s\";",
        config.getSasl().get("username"), config.getSasl().get("password")));

properties.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/etc/ssl/kafka/client.truststore.jks");

consumer = new KafkaConsumer<>(properties);

Before this client can start consuming from Kafka, the user credentials need to be registered. All access control is based on SASL (Simple Authentication and Security Layer) using SCRAM (Salted Challenge Response Authentication Mechanism). Upon registration with our Kafka cluster, each user will be given a username and a password, which should not be shared with anyone else. This user will be given consumer rights to the topics your organization needs and to every consumer group used to read from these topics. By default, every resource in the cluster denies access to everyone unless the provided credentials match the ACL for that resource.

Testing Client Configuration

Before setting up a connection to the schema registry and downloading the appropriate Avro schema etc., there is a health check topic where random test data is published. Every user registered on our Kafka platform should have access to consume messages from this topic, with a provided consumer group, as long as the security configuration mentioned above is set up correctly. The topic can be used as a simple way to get started with Kafka and to test that your client configuration meets the requirements, without having to think about any business logic on the consumer side. Below is all the information you need to get started:

  1. topic name: healthcheck-staging
  2. consumer group: <username>-consumer-staging

Testing access to topics can also be done, without having to implement your own Kafka clients, by using a tool called kafkacat. Information about installing and using this tool can be found here:
An example command using all the necessary security configuration for the staging cluster:

BOOTSTRAP_SERVERS=<staging bootstrap servers>
TOPIC=healthcheck-staging
SECURITY_PROTOCOL=SASL_SSL
SASL_MECHANISMS=SCRAM-SHA-512
SASL_USERNAME=<your provided username>
SASL_PASSWORD=<your provided password>
kafkacat -C -b $BOOTSTRAP_SERVERS -t $TOPIC -X security.protocol=$SECURITY_PROTOCOL -o beginning -X sasl.mechanisms=$SASL_MECHANISMS -X ssl.ca.location=certs/entur-cacert.pem -X sasl.username=$SASL_USERNAME -X sasl.password=$SASL_PASSWORD

⚠️Please note that records will be consumed with this tool, which means that you should NOT use the same consumer group as you plan to use in your actual application.

Topic Descriptions

Naming conventions

All data on the external cluster is segregated into topics that only one organization has access to. As such, your organization can only access data applicable to your organization id, and no other organization will be able to read this data. Each topic name contains the organization id of the data owner. This id is appended to every topic on the cluster and drives the access rights to the topic.

All topic names (except the health check topic) follow the same convention: 

<content description>-<environment>-<organisationId>
Example: payments-staging-1






  • healthcheck-staging – Health check topic (6 partitions, 1 h retention)
  • inventory-stock-invalidated-staging-<orgid> – Invalidation summary after a stock change, to be read by the organization (9 partitions, 7 d retention)
  • inventory-stock-staging-<orgid> – Updated stock result for a journey between origin and destination, to be read by the organization (9 partitions, 7 d retention)
  • inventory-seat-reservations-staging-<orgid> – All seat reservations performed in Seating, to be read by the organization (9 partitions, 7 d retention)
  • payments-staging-<orgid> – Completed payments with payment transactions for the organization (15 partitions, 28 d retention)
  • credits-staging-<orgid> – Completed credits with credit transactions for the organization (15 partitions, 28 d retention)
  • sales-transaction-summary-staging-<orgid> – Confirmed sales, including all affected order line versions, where the organization is either the owner of the product or the sale was completed in an organization-owned channel (15 partitions, 28 d retention)
  • given-consent-changed-staging-<orgid> – All consents that have been changed by users from this organization (9 partitions, compacted)
  • customer-changed-staging-<orgid> – All customers that have been changed for this organization (9 partitions, compacted)

Schema Registry

All schemas can be found using the REST API for the Schema Registry. This registry will always be versioned and up-to-date.

⚠️For security reasons, we only allow READ access for external clients. This means that the HTTP GET methods are the only valid requests from the schema registry REST API documentation.
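For example, subjects can be listed with plain GETs; GET /subjects and GET /subjects/<subject>/versions/latest are standard Confluent Schema Registry endpoints (the registry URL below is a placeholder):

```shell
# List all registered subjects (read-only access).
curl 'https://<schema-registry-url>/subjects'

# Fetch the latest schema version for a given subject.
curl 'https://<schema-registry-url>/subjects/<subject>/versions/latest'
```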

 Schema Registry UI - Staging
docker run --rm -p 8000:8000 -e "SCHEMAREGISTRY_URL=" -e "PROXY=true" landoop/schema-registry-ui


 Schema Registry UI - Production
docker run --rm -p 8000:8000 -e "SCHEMAREGISTRY_URL=" -e "PROXY=true" landoop/schema-registry-ui





No libraries exist yet. Watch this page...

The API reference(s) can be found here: