A proof of concept for discovering event sourcing.
In docker-compose.yml, change the IP address to your host's IP address.
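Where exactly the host IP goes depends on the compose file; with a Kafka broker it is typically the advertised listener. A hypothetical excerpt (the service name, image, and environment key are assumptions, not taken from this repo):

```yaml
services:
  kafka:
    image: wurstmeister/kafka   # assumed image; keep whatever the compose file already declares
    environment:
      # Replace 192.168.1.10 with your host's IP address so external
      # clients can reach the broker through the advertised listener.
      KAFKA_ADVERTISED_HOST_NAME: 192.168.1.10
```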
Run the commands below in each of the following folders:
- customer-consoleUI
- shopping-consoleUI
- readmodels-consoleUI (to be implemented)
$ dotnet restore
$ dotnet publish -c release -o publish
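The restore/publish steps above can be scripted across all three projects. A minimal sketch, assuming the folder names from the list above; set `DRY=1` to print the commands instead of executing them (otherwise the .NET SDK must be installed):

```shell
# Run restore + publish in each console project folder.
publish_all() {
  for proj in customer-consoleUI shopping-consoleUI readmodels-consoleUI; do
    echo "==> $proj"
    ( cd "$proj" || exit 1
      # ${DRY:+echo} expands to "echo" when DRY is set, giving a dry run.
      ${DRY:+echo} dotnet restore
      ${DRY:+echo} dotnet publish -c release -o publish )
  done
}
```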
Build the Docker images for each project:
$ docker build . -t coilz/web-store-customer
$ docker build . -t coilz/web-store-shopping
$ docker build . -t coilz/web-store-readmodels
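The three build commands can likewise be looped. This sketch assumes each Dockerfile lives in its own project folder (hence the `cd` before `docker build .`); set `DRY=1` to print the commands instead of invoking docker:

```shell
# Build the image for each project, tagging it coilz/web-store-<name>.
build_images() {
  for name in customer shopping readmodels; do
    ( cd "${name}-consoleUI" || exit 1
      # ${DRY:+echo} expands to "echo" when DRY is set, giving a dry run.
      ${DRY:+echo} docker build . -t "coilz/web-store-${name}" )
  done
}
```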
From the root folder, start the composition:
$ docker-compose up -d
To stop the composition:
$ docker-compose down
Now execute in a different shell:
$ docker network ls
You will notice that a new network named `<project>_default` has been created; in my case it is `eventsourcingpoc_default`.
$ docker run --rm --network="eventsourcingpoc_default" coilz/web-store-customer kafka my-topic
You can also add these containers to the compose file if you want.
$ docker run --rm -it --network="eventsourcingpoc_default" coilz/web-store-shopping kafka my-topic
$ docker run --rm -it --network="eventsourcingpoc_default" coilz/web-store-readmodels kafka my-topic
- Extract EventSourcingPoc.EventSourcing.Domain into EventSourcingPoc.Domain, so that the process has a dependency on the domain
- Create a container with real storage
- Make sure all contexts have read models
- Create a read model for external use (BI)
- Create Kafka consumers and producers for (cross-context) domain events
- Create web APIs for the various contexts
- Add required properties for events