Have you ever thought of your data centers and cloud infrastructure (private and public) as one big computer, where you can deploy your applications with the click of a button, without worrying too much about the underlying infrastructure? Well… DC/OS allows you to manage your infrastructure from a single point, offering you the possibility to run distributed applications, containers, services, and jobs while maintaining a certain abstraction from the infrastructure layer, as long as it provides computing, storage, and networking capabilities.
After deploying my ML model on a Kubernetes cluster and as a Lambda function, I will now deploy it on a DC/OS cluster.
What is DC/OS:
DC/OS (the Datacenter Operating System) is itself a distributed system, a cluster manager, a container platform, and an operating system.
DC/OS manages the three layers of software, platform, and infrastructure.
The dashboard:
The catalog:
The DC/OS UI offers a catalog of certified and community packages that users can install in seconds, like Kafka, Spark, Hadoop, and MySQL.
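The same catalog packages can also be installed from the DC/OS CLI. As an illustration (Spark is just an example package, not something the post depends on):
[code]
dcos package install spark
[/code]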
Deploying apps and ML models on DC/OS:
The application I’m deploying is a web server running the model I created in my previous posts to make predictions.
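For context, server.py is a small web server that loads the trained model and exposes a prediction endpoint. Here is a minimal sketch of what it might look like, assuming Flask and a joblib-serialized scikit-learn model; the file name model.pkl, the /predict route, and the payload shape are assumptions for illustration, not the exact code from the repo (only port 8088 comes from the app definition below):
[code]
from flask import Flask, request, jsonify
import joblib

app = Flask(__name__)

# Load the trained model from the previous posts (file name assumed)
model = joblib.load("model.pkl")

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON payload like {"features": [[...], [...]]}
    features = request.get_json()["features"]
    predictions = model.predict(features).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    # Listen on 0.0.0.0:8088 to match the port declared in app.json
    app.run(host="0.0.0.0", port=8088)
[/code]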
DC/OS relies on an application definition file that looks like this:
app.json:
[code]
{
  "volumes": null,
  "id": "mlpregv3",
  "cmd": "python server.py",
  "instances": 1,
  "cpus": 1,
  "mem": 128,
  "disk": 0,
  "gpus": 0,
  "container": {
    "type": "DOCKER",
    "docker": {
      "image": "mbenachour/dcos-mlpreg:1",
      "forcePullImage": false,
      "privileged": false,
      "network": "HOST",
      "portMappings": [
        { "containerPort": 8088, "hostPort": 8088 }
      ]
    }
  }
}
[/code]
The rest of the code can be found in my GitHub repo.
After you configure your DC/OS CLI and log in, you can run this command:
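The exact snippet isn’t shown here; assuming the definition above is saved as app.json, the standard Marathon deployment command would be:
[code]
dcos marathon app add app.json
[/code]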
If we take a look at the UI, we can see that the app/web server has been deployed: