SAP Cloud Application Programming Model (CAP) Getting Started

This article describes how to get started with building Cloud Application Programming Model (CAP) projects and the corresponding artefacts with Core Data Services (CDS).
 


For local native development see [[SAP HANA Local Native Development]]


== Sources and Further Reading ==
 
{| class="wikitable"
|+
!Description
!URL
!Comment
|-
|All-in-one Quick Start
|https://github.com/SAP-samples/cloud-cap-walkthroughs/blob/master/exercises-node/intro/README.md
|
|-
|Back to basics SAP Cloud Application Programming Model (CAP)
|https://www.youtube.com/playlist?list=PL6RpkC85SLQBHPdfHQ0Ry2TMdsT-muECx<br />https://github.com/qmacro/capb2b
|SAP Developers YouTube Playlist and GitHub Repository
|-
|Qmacro / DJ Adams
|https://qmacro.org/tags/cap/
|
|-
|Qmacro @ SAP Community
|https://community.sap.com/t5/user/viewprofilepage/user-id/53
|
|-
|SAP CAP Documentation
|https://cap.cloud.sap/
|
|-
|SAP CAP Sample Repositories
|https://github.com/SAP-samples/cloud-cap-samples
|Use opensap*-Branches
|-
|GitHub Repositories of CAP and Fiori Showcases
|https://github.com/SAP-samples/cap-sflight<br />https://github.com/SAP-samples/fiori-elements-feature-showcase
|
|-
|ABAP Freak Show Ep. 1 - HANA Cloud and the Business Application Studio
|https://www.youtube.com/watch?v=a3WPQwmpbvI&list=PLoc6uc3ML1JR38-V46qhmGIKG07nXwO6X&index=71
|
|}


== Installation of Prerequisites ==

'''Relevant Tools from SAP'''

 npm i -g @sap/cds
 npm i -g @sap/cds-dk


'''Additional 3rd Party Tools'''


  npm i -g hana-cli


== Setup of CAP Projects ==


=== Create CDS Project ===
  cds init bookshop


or

 cds init MyCDSProject --add hana,mta


Download and install dependencies:

 npm install

=== Start Service ===
 cds watch
 cds w


''cds watch'' also monitors changes to the underlying files and restarts the service when a file has changed.


=== Prevent Reloading ===
To prevent the service from reloading, for example when saving temporary files in the project directory, save the files in one of the directories that are excluded from monitoring for file changes:
* _out
* node_modules
* @types
* @cds-models
For more details see: https://qmacro.org/blog/posts/2024/04/10/avoid-design-time-cap-server-restarts-when-maintaining-local-data-files/
=== Add Persistency to CDS ===
For SQLite, first install the SQLite3 package by executing this in the root of the project folder:
npm i sqlite3 -D
Deploy data model to different database types:
cds deploy
This deploys the CDS entity models and CSV files to the database specified in ''package.json'' in section ''cds.requires.db'', which should look like this:<syntaxhighlight lang="json" line="1">
{
  "cds": {
    "requires": {
      "db": {
        "kind": "sqlite",
        "credentials": { "url": "db.sqlite" }
      }
    }
  }
}
</syntaxhighlight>For more explanation on SQLite see https://cap.cloud.sap/docs/guides/databases-sqlite
cds deploy --to sqlite          # Deploys to sqlite.db
cds deploy --to sqlite:my.db    # Deploys to my.db
cds deploy --to hana            # Deploys to HDI container on HANA
                                # (Requires Cloud Foundry login)   
Unlike older cds versions, this no longer updates the ''package.json'' section ''cds.requires.db''.
In order to see the actual created SQL statements:
 cds compile srv/cat-service.cds --to sql     # Creates SQL statements
 cds compile srv/cat-service.cds --to hana    # Creates hdbcds or hdbtable/hdbview artefacts
When deployed to an SQLite database, use this statement to view its contents:
sqlite3 my.db -cmd .dump
Create CSV header files in db location (by default db/data) for modeled entities:
cds add data
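For a hypothetical ''Books'' entity this creates, for example, a ''db/data/my.bookshop-Books.csv'' containing just a header row with the entity's element names (the entity name and columns here are illustrative); data rows are then added manually:

```csv
ID,title,author,stock
```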
== Deployment to SAP Business Technology Platform (BTP) ==
Login to CF space:
cf login
=== Build Deployment Package ===
cds add hana
''package.json'' needs to contain ''"kind": "hana"''. (really? - for which CF / HANA Cloud Version?)
Also, when deploying to BTP (which internally runs HANA 4.0), ''"deploy-format": "hdbtable"'' has to be set and ''hdi-deploy'' should be at least version 4. Example:
<syntaxhighlight lang="JSON">
{
    ...
    "devDependencies": {
        "@sap/hdi-deploy": "^4"
    },
    ...
    "cds": {
        "hana": {
            "deploy-format": "hdbtable"
        },
        "requires": {
            "db": {
                "model": [
                    "db",
                    "srv"
                ],
                "kind": "hana"
            }
        }
    }
}
</syntaxhighlight>
=== Deploy via Cloud Foundry CLI ===
Build the deployment artifacts:
cds build/all
cds build --production        # Only production profile?
Create a hana service for the hdi-container. The stated hdi-container name must match the service name in ''/gen/db/src/manifest.yaml''.
cf create-service hana hdi-shared <app-db-hdi-container>
Now push the DB deployer application and also the actual service application:
cf push -f gen/db
cf push -f gen/srv --random-route
=== Deploy via MTA ===
Install the MTA Build Tool:
npm install -g mbt
Generate MTA project descriptor file ''mta.yaml'' and build MTA archive:
cds add mta
mbt build
Deploy MTA archive to CF space:
cf deploy mta_archives/<buildmta>.mtar
=== Troubleshooting ===
Real-time output of CF services can be displayed in the BAS terminal with:
cf logs <appname>
== CDS Commands Cheatsheet ==
=== Noteworthy Commands ===
{| class="wikitable"
|+
!Command
!Description
|-
|cds env
|Displays the effective configuration...
|-
|cds env ls
|... in .properties format
|-
|cds compile services.cds
|Compile the CDS model to a JSON-like structure
|-
|<nowiki>cds compile services.cds | jq</nowiki>
|Ditto, but as actual JSON
|-
|cds compile services.cds --to sql
|Compile SQL statements from CDS
|-
|cds compile services.cds --to edmx
|Compile EDMX file (metadata) from CDS
|-
|cds compile services.cds --to csn
|Output CSN (Core Schema Notation)...
|-
|<nowiki>cds compile services.cds --to csn | jq '.definitions | keys'</nowiki>
|... and process output with JSON processor by extracting keys of definition nodes
|-
|cds compile schema.cds --to yaml
|Output YAML of the database schema
|-
|cds compile services.cds --to yaml
|Output YAML of all services including their db schema
|-
|cds add data
|Create CSV header files in db location (by default db/data) for modeled entities
|}
=== Using CDS REPL & JavaScript APIs ===
cds r
cds repl
This launches a read-eval-print loop (REPL) which can be used as an interactive playground to experiment with CDS' JavaScript APIs.
{| class="wikitable"
|+
!API
!Description
|-
|cds.utils.uuid()
|Returns a UUID
|}
== Accessing Services from Command Line ==
=== Reading from Services ===
Read output of services via ''curl'' and pipe it through ''jq'' for nicer formatting:
curl -s 'localhost:4004/odata/v4/bookshop/Orders' | jq
Output (example for Bookshop repo):<syntaxhighlight lang="json" line="1">
{
  "@odata.context": "$metadata#Orders",
  "value": [
    {
      "ID": "6091d4ab-650e-4afa-90bb-163305e473a2",
      "comment": "second order"
    },
    {
      "ID": "ac5aeb9f-c7cd-4f52-ab4a-9c0313ded402",
      "comment": "first order"
    }
  ]
}
</syntaxhighlight>
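The same response can of course be post-processed without ''jq''; a small Node sketch that extracts the comment fields from the payload shown above:

```javascript
// Extract the comment field from each entry of the OData response
// (payload copied from the example output above).
const response = {
  "@odata.context": "$metadata#Orders",
  value: [
    { ID: "6091d4ab-650e-4afa-90bb-163305e473a2", comment: "second order" },
    { ID: "ac5aeb9f-c7cd-4f52-ab4a-9c0313ded402", comment: "first order" }
  ]
};

const comments = response.value.map(order => order.comment);
console.log(comments); // [ 'second order', 'first order' ]
```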
=== Deep Insert into Service with POST ===
For performing a deep insert we first need a structure that is at least two levels deep.
The following ''neworder.json'' file with a two-level deep structure is given:<syntaxhighlight lang="json" line="1">
{
    "comment": "New Order",
    "Items": [
            {
                "pos": 1,
                "quantity": 10
            },
            {
                "pos": 2,
                "quantity": 20
            }
        ]
}
</syntaxhighlight>To perform the deep insert, POST the JSON file to the service using ''curl''. The filename needs to be prefixed with a @. As we are talking to a web service we also need to specify the content type via a header, in this case ''application/json'':
curl --verbose --data @neworder.json --header 'content-type: application/json' --url 'localhost:4004/odata/v4/bookshop/Orders'
By default ''curl'' with parameter ''--data'' uses the POST method. For more details on the HTTP methods see [[Notable RFCs#HTTP Methods GET, POST, PUT, DELETE|Notable RFCs Section HTTP Methods GET, POST, PUT, DELETE]]
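What curl sends here can be sketched in plain Node as well (a hypothetical helper, not a CAP client API); the function only assembles the request, so it can be inspected without a running server:

```javascript
// Build the POST request for a deep insert (sketch; actually sending it
// would require a CAP server listening on localhost:4004).
function buildDeepInsert(order) {
  return {
    url: 'http://localhost:4004/odata/v4/bookshop/Orders',
    method: 'POST',                                  // curl --data defaults to POST, too
    headers: { 'content-type': 'application/json' }, // same header as in the curl call
    body: JSON.stringify(order)
  };
}

const req = buildDeepInsert({
  comment: 'New Order',
  Items: [{ pos: 1, quantity: 10 }, { pos: 2, quantity: 20 }]
});
// To actually send it: await fetch(req.url, req)
```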
We should receive a response similar to the following one, showing that we connected to localhost, posted to the OData service, and got HTTP status ''201 Created'':<syntaxhighlight lang="http">
*  Trying 127.0.0.1:4004...
* Connected to localhost (127.0.0.1) port 4004 (#0)
> POST /odata/v4/bookshop/Orders HTTP/1.1
> Host: localhost:4004
> User-Agent: curl/7.74.0
> Accept: */*
> content-type: application/json
> Content-Length: 193
>
* upload completely sent off: 193 out of 193 bytes
* Mark bundle as not supporting multiuse
< HTTP/1.1 201 Created
< X-Powered-By: Express
< X-Correlation-ID: 4eee1d5a-5b43-4577-8257-a99b026a0c43
< OData-Version: 4.0
< content-type: application/json;odata.metadata=minimal
< Location: Orders(9416b371-2355-402a-b06d-34f5b2c61704)
< Date: Sat, 20 Apr 2024 12:23:52 GMT
< Connection: keep-alive
< Keep-Alive: timeout=5
< Content-Length: 280
<
* Connection #0 to host localhost left intact
{"@odata.context":"$metadata#Orders(Items())/$entity","ID":"9416b371-2355-402a-b06d-34f5b2c61704","comment":"New Order","Items":[{"parent_ID":"9416b371-2355-402a-b06d-34f5b2c61704","pos":1,"quantity":10},{"parent_ID":"9416b371-2355-402a-b06d-34f5b2c61704","pos":2,"quantity":20}]}
</syntaxhighlight>
=== Update Records via Service with PATCH ===
Following up on the deep insert example above, let's assume we want to update the Order with comment ''New Order''. For this we create a ''neworder-update.json'' with different content, i.e.:<syntaxhighlight lang="json" line="1">
{
  "comment": "New Order",
  "Items": [
          {
              "pos": 1,
              "quantity": 1000
          }
      ]
}
</syntaxhighlight>To perform the update we need to refer to the ID of the previously created Order. We take the ID ''9416b371-2355-402a-b06d-34f5b2c61704'' from the server response above and append it to the URL after ''Orders'' in parentheses:
curl -X PATCH --verbose --data @neworder-update.json --header 'content-type: application/json' --url 'localhost:4004/odata/v4/bookshop/Orders(9416b371-2355-402a-b06d-34f5b2c61704)' | jq .
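The key-in-parentheses URL used here follows the OData v4 convention of addressing a single entity by its key; it can be built with a tiny helper (hypothetical names, plain Node):

```javascript
// OData v4 addresses a single entity by appending its key in
// parentheses, e.g. Orders(<uuid>).
function entityUrl(service, entity, id) {
  return `${service}/${entity}(${id})`;
}

const url = entityUrl(
  'http://localhost:4004/odata/v4/bookshop',
  'Orders',
  '9416b371-2355-402a-b06d-34f5b2c61704'
);
console.log(url);
// http://localhost:4004/odata/v4/bookshop/Orders(9416b371-2355-402a-b06d-34f5b2c61704)
```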
The expected result is HTTP code ''200 OK'':<syntaxhighlight lang="http">
  % Total    % Received % Xferd  Average Speed  Time    Time    Time  Current
                                Dload  Upload  Total  Spent    Left  Speed
  0    0    0    0    0    0      0      0 --:--:-- --:--:-- --:--:--    0*  Trying 127.0.0.1:4004...
* Connected to localhost (127.0.0.1) port 4004 (#0)
> PATCH /odata/v4/bookshop/Orders(9416b371-2355-402a-b06d-34f5b2c61704) HTTP/1.1
> Host: localhost:4004
> User-Agent: curl/7.74.0
> Accept: */*
> content-type: application/json
> Content-Length: 121
>
} [121 bytes data]
* upload completely sent off: 121 out of 121 bytes
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< X-Powered-By: Express
< X-Correlation-ID: 9a21bb8f-cfcc-4641-bc68-2f1bcbb7d5a8
< OData-Version: 4.0
< content-type: application/json;odata.metadata=minimal
< Date: Sat, 20 Apr 2024 12:34:13 GMT
< Connection: keep-alive
< Keep-Alive: timeout=5
< Content-Length: 198
<
{ [198 bytes data]
100  319  100  198  100  121  19800  12100 --:--:-- --:--:-- --:--:-- 31900
* Connection #0 to host localhost left intact
{
  "@odata.context": "$metadata#Orders/$entity",
  "ID": "9416b371-2355-402a-b06d-34f5b2c61704",
  "comment": "New Order",
  "Items": [
    {
      "parent_ID": "9416b371-2355-402a-b06d-34f5b2c61704",
      "pos": 1,
      "quantity": 1000
    }
  ]
}
</syntaxhighlight>Check the result by querying the service again:
curl -s 'localhost:4004/odata/v4/bookshop/Orders?$expand=Items' | jq
<syntaxhighlight lang="json">
{
  "@odata.context": "$metadata#Orders(Items())",
  "value": [
    {
      ...
    },
    {
      "ID": "9416b371-2355-402a-b06d-34f5b2c61704",
      "comment": "New Order",
      "Items": [
        {
          "parent_ID": "9416b371-2355-402a-b06d-34f5b2c61704",
          "pos": 1,
          "quantity": 1000
        }
      ]
    },
    {
      ...
    }
  ]
}
</syntaxhighlight>
[[Category:SAP]]
[[Category:HANA]]
[[Category:JavaScript]]
