Bloody Linux! Just install the program/software
-
@Dashrender said:
How does the installer deal with third party components? The dev tells the installer where to find them online, the installer then goes and downloads them and installs them.
No, the devs and the installers never get to make these decisions; that's what makes it work so well. It is built on managed repositories of code, and this is where the safety comes in. The repos are controlled, so no matter what reckless package request someone puts into an installer package, it only gets installed if you trust the source. This is key to a reliable system.
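As a rough sketch of what "only gets installed if you trust the source" means in practice on a yum-based system (the paths and repo tooling below are just the stock CentOS ones, not anything specific to this thread):

# Show only the repositories this machine is actually willing to install from.
yum repolist enabled

# Each repo is defined under /etc/yum.repos.d/ and carries its own trust settings;
# gpgcheck=1 means packages from that repo must be signed with the repo's published
# GPG key before yum will install them, no matter who requested the install.
grep -H "gpgcheck" /etc/yum.repos.d/*.repo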
-
@scottalanmiller said:
If you were doing this "the Linux way" you just "yum install snipeit" and done.
Real men compile from source.
-
@scottalanmiller said:
I think that it is often overlooked that "hard" Linux installs often involve doing a huge amount of work that in Windows is viewed as an unrelated task (downloading and installing the platform, database, etc.)
Because it's assumed that you already have the stuff available. Is it any different than running all kinds of updates to get a newer version of PHP, MySQL, or anything else on Linux? Yeah, you can script it, but that's about it. You still have to do all those steps on any *nix.
When I deploy something other than standard .NET code on a Windows install, I make sure I have the right binaries in my library ready to go. And this is definitely outside of .NET. We don't even support the use of PHP on IIS. It's always a bad idea to use it.
Linux people think Windows is so complicated and hard when it's only them making it out to be. Then they deride us Windows admins because we only "click" and don't know anything.
-
@PSX_Defector said:
Linux people think Windows is so complicated and hard when it's only them making it out to be. Then they deride us Windows admins because we only "click" and don't know anything.
No, it definitely involves asking Windows admins how they make it easy and hearing them answer that they don't, that they actually do all of this stuff by hand all of the time.
-
@PSX_Defector said:
Because it's assumed that you already have the stuff available. Is it any different than running all kinds of updates to get a newer version of PHP, MySQL, or anything else on Linux? Yeah, you can script it, but that's about it. You still have to do all those steps on any *nix.
But you don't do that stuff on Linux, you see. That's what we've been pointing out. It's all handled by the OS itself: the install, the dependency resolution, and the versioning and patching. What you are describing is actually the misconception that Windows admins seem to always have, that all of this work that you are doing on Windows has to be done on Linux.
It's actually that much easier on Linux!
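As a minimal sketch of that claim, assuming a stock CentOS 7 box and using httpd, mariadb-server and php purely as stand-in packages:

# One command resolves, downloads and installs the packages and all of their
# dependencies from the configured, trusted repositories.
yum -y install httpd mariadb-server php

# One command later patches the OS and everything installed this way,
# dependencies included.
yum -y update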
-
@scottalanmiller said:
Here is the full install on Linux: yum -y install epel-release; mkdir -p /var/www/html; cd /var/www/html/; wget https://raw.githubusercontent.com/snipe/snipe-it/master/install.sh && chmod 744 install.sh && ./install.sh && cd snipeit; sed -i "s/'timezone' => '',/'timezone' => 'UTC',/" app/config/app.php; php artisan app:install
That's what I tried and got "page not found"; going to try again today when I finish a few other things.
-
This is on a fresh CentOS 7 server install? I did this three times from a Digital Ocean CentOS 7 starter image and it came right up.
Check the status of your SELinux and your firewall.
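For reference, the usual quick checks on a stock CentOS 7 box look something like this (assuming SELinux and firewalld are what the image actually uses):

# SELinux mode: Enforcing, Permissive or Disabled
getenforce
sestatus

# firewalld: is it running, and what is currently allowed?
firewall-cmd --state
firewall-cmd --list-all

# As a quick test, allow http in the running config (a permanent change
# would also need --permanent plus a reload):
firewall-cmd --add-service=http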
-
@scottalanmiller said:
This is on a fresh CentOS 7 server install? I did this three times from a Digital Ocean CentOS 7 starter image and it came right up.
Check the status of your SELinux and your firewall.
DO probably includes wget.
-
I guess this goes toward how engineering will create the bad image that is used by the company... but it will be different from a normal off-the-shelf install.
-
@Dashrender said:
I guess this goes toward how engineering will create the bad image that is used by the company... but it will be different from a normal off-the-shelf install.
Sort of. But having wget is generally nice and not generally considered a security concern (it's just an extra package that replicates what curl does). Having wget isn't bad, but you also don't want your OS coming with extra stuff just for fun. You want, at least in a server image, everything as lean and clean as you can possibly get it. But as an optional install, there's nothing wrong with internal engineering including it. This would be "external engineering" being a bit of a pain.
I think that Vultr actually includes the EPEL by default, which, while I like it personally, is overall pretty obnoxious (although, to be clear, including the EPEL does not imply that anything from the EPEL has been installed).
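If you want to check whether a given image already ships the EPEL repo definition, and separately whether it is enabled as an install source, something along these lines works on CentOS:

# Is the epel-release package (the repo definition) present in the image?
rpm -q epel-release

# Is the repo actually enabled as a source yum will install from?
yum repolist enabled | grep -i epel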
-
@scottalanmiller said:
@Dashrender said:
I guess this goes toward how engineering will create the bad image that is used by the company... but it will be different from a normal off-the-shelf install.
Sort of. But having wget is generally nice and not generally considered a security concern (it's just an extra package that replicates what curl does). Having wget isn't bad, but you also don't want your OS coming with extra stuff just for fun. You want, at least in a server image, everything as lean and clean as you can possibly get it. But as an optional install, there's nothing wrong with internal engineering including it. This would be "external engineering" being a bit of a pain.
I think that Vultr actually includes the EPEL by default, which, while I like it personally, is overall pretty obnoxious (although, to be clear, including the EPEL does not imply that anything from the EPEL has been installed).
Exactly. None of these things are bad, it's just that by including them it leads people to expect certain things.
I see so many setup guides for things that assume epel-release or assume wget or some such thing.
-
I definitely see wget just assumed in tons of guides.
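The defensive habit, rather than assuming it, is for a guide or script to check for wget first or fall back to curl; a sketch:

# Install wget only if it is not already present.
rpm -q wget >/dev/null 2>&1 || yum -y install wget

# Or skip the assumption entirely and use curl, which minimal images are
# more likely to already carry (same script URL as the install line above).
curl -O https://raw.githubusercontent.com/snipe/snipe-it/master/install.sh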
-
@scottalanmiller said:
@PSX_Defector said:
Because it's assumed that you already have the stuff available. Is it any different than running all kinds of updates to get a newer version of PHP, MySQL, or anything else on Linux? Yeah, you can script it, but that's about it. You still have to do all those steps on any *nix.
But you don't do that stuff on Linux, you see. That's what we've been pointing out. It's all handled by the OS itself: the install, the dependency resolution, and the versioning and patching. What you are describing is actually the misconception that Windows admins seem to always have, that all of this work that you are doing on Windows has to be done on Linux.
So when I compile from source a kernel install, it's just magically gonna install, configure, secure and harden third party applications for me? Yeah, I don't think so.
What you are talking about is outside of "Linux" and is done at the distro level. Hence why Ubuntu default installs garbage and DSL does not. RPM/apt/YaST, all of that is distro specific which controls that, it's not the Linux kernel doing it. And it's exactly the way that Windows handles it as well.
Just because your binary package management tool takes care of all of that doesn't mean that it's not being done. Dependencies and various other stuff for third party applications are perfectly built into the MSI installer process. It knows how to check and what to get, as long as the installer is set up to do it. Just because some folks don't make that happen within their installers doesn't mean that Linux is superior, because I've known plenty of RPMs in repositories that are not worth a damn. As we see here, crap installers are crap installers.
-
@PSX_Defector said:
So when I compile from source a kernel install, it's just magically gonna install, configure, secure and harden third party applications for me? Yeah, I don't think so.
See, this is how silly you have to get to come up with examples. No production environment should be compiling custom kernels. That's SO silly. Is this really what Windows people think goes on in the Linux world? Even by the late 1990s this was rare.
You aren't compiling kernels in Windows, are you? Where do ideas like this come from?
-
@PSX_Defector said:
What you are talking about is outside of "Linux" and is done at the distro level. Hence why Ubuntu default installs garbage and DSL does not. RPM/apt/YaST, all of that is distro specific which controls that, it's not the Linux kernel doing it. And it's exactly the way that Windows handles it as well.
Right. Windows has one distro, Windows, and it lacks RPM, APT, and YaST style functionality. We are comparing the enterprise Windows release to enterprise Linux options (RHEL/CentOS, SUSE, and Ubuntu).
-
@PSX_Defector said:
Just because your binary package management tool takes care of all of that doesn't mean that it's not being done. Dependencies and various other stuff for third party applications are perfectly built into the MSI installer process. It knows how to check and what to get, as long as the installer is set up to do it.
How does MSI do this? There isn't a centralized repo of the necessary software for this. Sure, MSI can reach out and start downloading from third party sites, anything can do that. But how do you do it through a managed repo system?
And why is no one doing this on Windows if MSI has it baked in? In the Linux distros, the power to do this is included in the OS. The toolsets are inclusive. How does MSI handle the DLL overlaps, versioning, automatic updates, repo management, etc.? How does this exist in Windows with no one knowing about it?
I realize that the biggest stumbling block to the Windows world is the human ecosystem that they have, and that simple things like the existence of the RSAT can go undetected even by large, rich Windows admin departments. But after decades of comparisons between Linux package management and Windows' lack thereof, why has absolutely no one come forth to propose that Windows in fact has this and just no one knows?
So... give me an example. Where do I run an install command that installs a trusted repo or similar, gets a major package that I need, installs third party dependencies in a trusted manner, and then keeps them patched using tools from the OS?
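For contrast, the Linux-side version of that lifecycle looks roughly like this on CentOS 7 (nginx is used purely as a stand-in here, since it lives in the EPEL rather than the base repos):

# 1. Add a trusted, signed third-party repository.
yum -y install epel-release

# 2. Install a package from it; its dependencies come along automatically.
yum -y install nginx

# 3. Keep it, and everything else installed this way, patched with the OS's own tooling.
yum -y update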
-
@scottalanmiller said:
So... give me an example. Where do I run an install command that installs a trusted repo or similar, gets a major package that I need, installs third party dependencies in a trusted manner, and then keeps them patched using tools from the OS?
Spiceworks. It installs what it needs, updates all the necessary components, and brings everything up to date when necessary.
-
@scottalanmiller said:
@PSX_Defector said:
What you are talking about is outside of "Linux" and is done at the distro level. Hence why Ubuntu default installs garbage and DSL does not. RPM/apt/YaST, all of that is distro specific which controls that, it's not the Linux kernel doing it. And it's exactly the way that Windows handles it as well.
Right. Windows has one distro, Windows, and it lacks RPM, APT, and YaST style functionality. We are comparing the enterprise Windows release to enterprise Linux options (RHEL/CentOS, SUSE, and Ubuntu).
And for that we have SCCM, or even WSUS or Windows Update, which keep things up to date versus the whims of whatever repository we are pointing to.
The reality of the world is that folks don't like to update. I had one customer whom I had to cut over to a new server but who was using a beta version of PHP; it took a lot of convincing to change it up. Updates are good, but in the reality of SMB, you cannot find any way to update shit code.
-
@PSX_Defector said:
@scottalanmiller said:
So... give me an example. Where do I run an install command that installs a trusted repo or similar, gets a major package that I need, installs third party dependencies in a trusted manner, and then keeps them patched using tools from the OS?
Spiceworks. It installs what it needs, updates all the necessary components, and brings everything up to date when necessary.
It's doing it itself. It has to run its own checks and its own software to do it. If Spiceworks is not running, it won't update. This is outside of the Windows system, so it doesn't count. It's valid that you can do this "in spite of Windows", but Windows is not providing the mechanism, and lacking a standard mechanism for patching makes this just plain silly. Yes, it is great that SW and Chrome have written extra software to work around this (actually, does SW do this? I've never seen it be able to do this), but it is unique to each app and doesn't meet the most basic qualifications for this discussion.
Also, how do you query all of these apps to see if they've really been updated? And how does SW do this if it has been secured and is unable to call home?
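On the Linux side those questions have short answers built into the same tooling, which is the point being argued; roughly:

# List everything with a newer version available in the configured repos.
# Exit status 100 means updates are pending, 0 means fully patched.
yum check-update

# Confirm the exact installed version of specific packages.
rpm -q httpd php mariadb-server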
-
@PSX_Defector said:
And for that we have SCCM, or even WSUS or Windows Update, which keep things up to date versus the whims of whatever repository we are pointing to.
So again... either you buy an extra product like SCCM to fill in the missing functionality (which supports what I'm saying, that it is lacking in the product itself), or you are stuck with WSUS, which adds management but not functionality, since it will not handle the software ecosystem.
So back to what I was saying: Windows doesn't do what Linux does by any stretch. I'm unclear whether you actually don't realize how much Linux does and what makes it safe, useful and powerful, or whether you are just fooling around and being sarcastic about Windows people thinking Windows does some of these things.
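To make the contrast concrete, unattended patching on CentOS 7 is itself just another package rather than a separately purchased product; a sketch, assuming the stock yum-cron defaults:

# Automatic updates come from the same repository system.
yum -y install yum-cron
systemctl enable yum-cron
systemctl start yum-cron

# Whether it applies updates (rather than only downloading them) is a
# one-line setting in /etc/yum/yum-cron.conf:
sed -i 's/^apply_updates = no/apply_updates = yes/' /etc/yum/yum-cron.conf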