anurag03/aiops-data-collector
AI-Ops data collector

Fetch data from an S3 bucket and pass it to the AI service.
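The fetch-and-forward flow itself is not shown in this README. The following is a minimal Python sketch of the idea, using only the standard library for the HTTP side; the bucket, key, and JSON envelope fields are illustrative assumptions, and the real service may well use boto3 for S3 access instead:

```python
"""Minimal sketch of the fetch-and-forward flow. All names below
(bucket, key, envelope fields) are illustrative, not the real schema."""
import json
import os
import urllib.request


def build_payload(bucket: str, key: str, body: bytes) -> bytes:
    """Wrap a fetched S3 object into a JSON envelope for the AI service."""
    return json.dumps({
        "bucket": bucket,
        "key": key,
        "size": len(body),
        "data": body.decode("utf-8", errors="replace"),
    }).encode("utf-8")


def forward(ai_url: str, payload: bytes) -> int:
    """POST the payload to the AI service endpoint; return the HTTP status."""
    req = urllib.request.Request(
        ai_url, data=payload, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # AI_MICROSERVICE_URL is the same variable stored in env.list
    ai_url = os.environ["AI_MICROSERVICE_URL"]
    payload = build_payload("my-bucket", "reports/example.csv", b"a,b\n1,2\n")
    print(forward(ai_url, payload))
```
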

OpenShift Deployment

Please consult the aiops-deploy repository for the deployment scheme.

Local build

If you would like to run the data collector service locally, you can build the container image using S2I:

❯ s2i build -c . centos/python-36-centos7 aiops-data-collector

For convenience, you can store your desired environment variables in a separate file:

❯ cat << EOT >> env.list
AI_MICROSERVICE_URL=<AI_SERVICE_HTTP_ENDPOINT>
EOT

And then run it as a Docker container:

❯ docker run --env-file env.list -it aiops-data-collector

Push to OpenShift repository

Some use cases require the image to be built locally and then deployed as-is. Docker and OpenShift can facilitate this process:

❯ docker build -t <CLUSTER_REGISTRY>/<PROJECT>/aiops-data-collector:latest .
❯ docker login -u <USERNAME> -p $(oc whoami -t) https://<CLUSTER_REGISTRY>
❯ docker push <CLUSTER_REGISTRY>/<PROJECT>/aiops-data-collector:latest
  • <CLUSTER_REGISTRY> is the registry's domain name (e.g. registry.insights-dev.openshift.com)
  • <PROJECT> is the target OpenShift project
  • <USERNAME> is your OpenShift login
