Is Docker a joke or do I just not see the point?
-
@dustinb3403 I don't think Docker itself is a joke, but I think a lot of Docker containers are a joke. One quick example is the OpenOffice container. The instructions for enabling HTTPS on it just don't work (or at least didn't a couple of months ago when I last tried setting it up). The only time I actually got it functioning with HTTPS, I had to add all the certificate files and configs in the Docker container itself, not on the host like the instructions led you to believe.
Doh, not OpenOffice, LibreOffice Online.
-
@travisdh1 But who wants to use OpenOffice anyway?
-
@jaredbusch said in Is Docker a joke or do I just not see the point?:
@travisdh1 But who wants to use OpenOffice anyway?
User fail. Should've been LibreOffice Online.
-
It's definitely not a joke. It's useful for the purposes it was built for. It's hard to manage by itself; that's why you run an orchestration/service-discovery layer with it. You define how your infrastructure is supposed to look in YAML and k8s will build it. A small example: you've got an RDS instance running for NextCloud and some other services. You tell k8s that you want 3 replicas of the NextCloud server running and to expose them on a certain port. K8s then does all of the service discovery and port mapping for you. If a node goes down, it brings up a pod on another node. On AWS and GCP it even automatically works with ELBs and whatever GCP calls their load balancer (I forget). So tomorrow you want to deploy the newly updated version of NextCloud? Tell k8s to deploy the latest container version and only destroy a pod after its replacement is created, and that's it.
It definitely has its place. It's made for immutable, distributed services and systems. It does that job well.
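To make that concrete, here's a rough sketch of the kind of YAML you'd hand to k8s for that NextCloud example. The image tag, port, and labels are placeholders I'm assuming for illustration, not a tested manifest:

```yaml
# Hypothetical sketch: 3 replicas of NextCloud behind a load balancer,
# with rolling updates that never destroy a pod before its replacement is up.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nextcloud
spec:
  replicas: 3                     # k8s keeps 3 pods running, rescheduling if a node dies
  selector:
    matchLabels:
      app: nextcloud
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0           # only destroy after another is created
      maxSurge: 1
  template:
    metadata:
      labels:
        app: nextcloud
    spec:
      containers:
        - name: nextcloud
          image: nextcloud:latest # bump the tag to roll out a new version
          ports:
            - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: nextcloud
spec:
  type: LoadBalancer              # on AWS/GCP this wires up their load balancer for you
  selector:
    app: nextcloud
  ports:
    - port: 80
      targetPort: 80
```

`kubectl apply -f nextcloud.yml` and k8s handles the rest; tomorrow's update is just changing the image tag and applying again.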
-
You can deploy full VMs with Docker without performance hits because it's just namespaces. Red Hat is deploying OpenStack (VMs) infrastructure with OpenShift (Docker and k8s). They say they want 8 (or whatever number) Nova servers running and k8s deploys them.
-
Can you give an example relevant for a small business?
If I worked at Google or Amazon and was responsible for their infrastructure being available to the whole planet all the time, I'd see the point. Why would any company with less than a few hundred million in revs or an equally large user count be interested?
-
@momurda who is your question to, me or @stacksofplates?
-
@dustinb3403 Anybody with some insight. I, too, think it is useless.
-
@momurda said in Is Docker a joke or do I just not see the point?:
Can you give an example relevant for a small business?
If I worked at Google or Amazon and was responsible for their infrastructure being available to the whole planet all the time, I'd see the point. Why would any company with less than a few hundred million in revs or an equally large user count be interested?
Scale isn't the only thing that matters. The whole purpose is to abstract away the underlying OS from your applications. You want to update an application but your OS doesn't include the correct libs in its repos? Doesn't matter. Include the libs in the container. Want to update your application in the middle of the day without affecting users? Go ahead. Want to test a new deployment alongside the old one? Just deploy with the new image.
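To illustrate the "include the libs in the container" point, here's a minimal Dockerfile sketch. The base image, libraries, and app name are assumptions made up for the example, not from any real project:

```dockerfile
# Hypothetical example: the app's runtime libs are baked into the image,
# so it doesn't matter what the host OS ships in its own repos.
FROM ubuntu:22.04
RUN apt-get update \
    && apt-get install -y --no-install-recommends libssl3 libpq5 \
    && rm -rf /var/lib/apt/lists/*
COPY myapp /usr/local/bin/myapp   # "myapp" is a placeholder binary
CMD ["myapp"]
```

The host only needs Docker; whether its own repos carry those library versions is irrelevant.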
-
@momurda said in Is Docker a joke or do I just not see the point?:
@dustinb3403 Anybody with some insight. I, too, think it is useless.
It's really not meant for the pure IT space. More for DevOps. It makes it very easy for programmers to deploy their latest version of the widget without IT being involved with the deployment.
-
@travisdh1 said in Is Docker a joke or do I just not see the point?:
@momurda said in Is Docker a joke or do I just not see the point?:
@dustinb3403 Anybody with some insight. I, too, think it is useless.
It's really not meant for the pure IT space. More for DevOps. It makes it very easy for programmers to deploy their latest version of the widget without IT being involved with the deployment.
I'd argue it's for both. Who wants to deploy a whole machine for what is essentially an Apache server? And like I said earlier, you can deploy whole VMs with it and be assured the correct number is running all of the time.
-
LXD is where it's at.
-
@stacksofplates said in Is Docker a joke or do I just not see the point?:
@momurda said in Is Docker a joke or do I just not see the point?:
Can you give an example relevant for a small business?
If I worked at Google or Amazon and was responsible for their infrastructure being available to the whole planet all the time, I'd see the point. Why would any company with less than a few hundred million in revs or an equally large user count be interested?
Scale isn't the only thing that matters. The whole purpose is to abstract away the underlying OS from your applications. You want to update an application but your OS doesn't include the correct libs in its repos? Doesn't matter. Include the libs in the container. Want to update your application in the middle of the day without affecting users? Go ahead. Want to test a new deployment alongside the old one? Just deploy with the new image.
OK, so scale isn't the critical component. Performance (cost) is a critical component. The argument is that you're making better use of your server. This I can understand.
What I'm failing to see is how it fits my environment. At least in my experience, almost every workload I have is different: different kernel, different OS, different focus.
And a traditional scaling system for either performance or uptime requirements fits well enough.
Docker very much feels like FreeNAS does. It's the Jurassic Park Effect in my eyes. DevOps can be given access to a hypervisor and have a system running in a matter of moments; just "because they don't want to bother IT" doesn't seem like a valid reason to need another solution that IT ends up supporting anyway.
-
@dustinb3403 said in Is Docker a joke or do I just not see the point?:
@stacksofplates said in Is Docker a joke or do I just not see the point?:
@momurda said in Is Docker a joke or do I just not see the point?:
Can you give an example relevant for a small business?
If I worked at Google or Amazon and was responsible for their infrastructure being available to the whole planet all the time, I'd see the point. Why would any company with less than a few hundred million in revs or an equally large user count be interested?
Scale isn't the only thing that matters. The whole purpose is to abstract away the underlying OS from your applications. You want to update an application but your OS doesn't include the correct libs in its repos? Doesn't matter. Include the libs in the container. Want to update your application in the middle of the day without affecting users? Go ahead. Want to test a new deployment alongside the old one? Just deploy with the new image.
OK, so scale isn't the critical component. Performance (cost) is a critical component. The argument is that you're making better use of your server. This I can understand.
What I'm failing to see is how it fits my environment. At least in my experience, almost every workload I have is different: different kernel, different OS, different focus.
And a traditional scaling system for either performance or uptime requirements fits well enough.
Docker very much feels like FreeNAS does. It's the Jurassic Park Effect in my eyes. DevOps can be given access to a hypervisor and have a system running in a matter of moments; just "because they don't want to bother IT" doesn't seem like a valid reason to need another solution that IT ends up supporting anyway.
How many different OSs do you have? Windows and Linux and something else?
It’s obviously geared towards Linux.
"DevOps can be given access to a hypervisor and have a system running in a matter of moments"
This isn’t always true and sometimes isn’t allowed for compliance reasons.
And again, you don’t have to be developing anything to use Docker. You can leverage the images that companies put out to run their software (like UNMS). At that point updates are completely separate from the OS.
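For that "just run the vendor's image" case, a minimal docker-compose sketch; the image name, port, and volume path are all placeholders (the vendor's own docs would have the real values):

```yaml
# Hypothetical compose file for a vendor-supplied image.
version: "3"
services:
  vendor-app:
    image: vendor/app:1.2.3       # pinned vendor image, updated on their schedule, not the OS's
    restart: unless-stopped
    ports:
      - "443:443"
    volumes:
      - ./data:/var/lib/app       # persistent data stays outside the container
```

Updating is `docker-compose pull && docker-compose up -d`; the host OS never sees the app's dependencies.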
I've been playing with Fedora Atomic Workstation. The base OS is built from rpm-ostree and all of the packages are some type of container (Docker, Flatpak, etc.). The OS doesn't have anything actually installed. You can pull in the new kernel image and not affect any applications at all. You can even rebase to another OS and still have your applications without change.
It’s all about abstraction, just like with any virtualization.
-
@stacksofplates you've changed the purpose several times already, from performance, to scalability, to now abstraction.
You're not making a great argument to change my mind.
-
@stacksofplates None of the applications you install are managed by the OS? How is this different from, and better than, installing applications to %appdata% in Windows?
You must update Docker images separately?
The Docker images may or may not get updated with package versions available to the OS?
What do you think about things like Bitnami?
-
@dustinb3403 said in Is Docker a joke or do I just not see the point?:
@stacksofplates you've changed the purpose several times already, from performance, to scalability, to now abstraction.
You're not making a great argument to change my mind.
Ha ok. First I stated it's for immutable and distributed systems, which means abstraction. Then I stated it's for abstraction. Then I stated again it's for abstraction.
I never said it's for scale or performance, ever. I literally just mentioned points of abstraction.
Edit: because I turned off auto correct and auto capitalization on my iPhone and I’m struggling with it.
-
Maybe containers are a solution for this, maybe not...
A set of in-house designed applications require a bunch of specific prerequisites, such as specific .NET versions and other specifics on a Windows OS. These applications need to work on distributed systems around the world, on various configurations of Windows. Having each app in a deployable container that contains all of its specific requirements would do the trick.
Is that a legitimate use of containers, and would it work in that Windows environment (on Windows 10 systems)?
-
@momurda said in Is Docker a joke or do I just not see the point?:
@stacksofplates None of the applications you install are managed by the OS? How is this different from, and better than, installing applications to %appdata% in Windows?
You must update Docker images separately?
The Docker images may or may not get updated with package versions available to the OS?
What do you think about things like Bitnami?
I'm assuming you mean on Atomic Workstation? You can "install" certain packages but they aren't installed in a traditional sense. It's an overlay filesystem on top of the operating system. Docker images are deployed separately, that's correct.
As far as I know, Bitnami just uses debs and rpms to deploy with.
-
@tim_g said in Is Docker a joke or do I just not see the point?:
Maybe containers are a solution for this, maybe not...
A set of in-house designed applications require a bunch of specific prerequisites, such as specific .NET versions and other specifics on a Windows OS. These applications need to work on distributed systems around the world, on various configurations of Windows. Having each app in a deployable container that contains all of its specific requirements would do the trick.
Is that a legitimate use of containers, and would it work in that Windows environment (on Windows 10 systems)?
Ya that’s a good use case. You can deploy to any version of an OS (that you can get Docker on obviously) because all of the dependencies are in the container. It’s abstracted away from the base OS.
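As a sketch of that Windows/.NET case (the base image tag and paths are illustrative, and on Windows 10 the container would run under Hyper-V isolation through Docker Desktop):

```dockerfile
# Hypothetical Windows container: the specific .NET Framework runtime and the
# in-house app are baked into the image, so the target machine only needs Docker.
FROM mcr.microsoft.com/dotnet/framework/runtime:4.8-windowsservercore-ltsc2019
WORKDIR C:/app
COPY publish/ .                   # build output of the in-house app (placeholder path)
ENTRYPOINT ["C:\\app\\MyInHouseApp.exe"]
```

Build it once, push it to a registry, and every machine pulls the same thing regardless of which .NET versions are installed locally.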
Another cool use case is for immutable deployments that are constantly rebuilt. You can do it with full cloud images, but it's faster with containers (this also works with LXC/LXD). There are companies that are destroying and rebuilding their whole infrastructure repeatedly (some every hour), and that way you never have systems with long uptimes.