@jt1001001 said in Printer Leasing/Maintenance - Installing Software on the Network for Monitoring Print Devices:
We have Xerox leased copiers/printers
I pity you.
No dice just yet, finishing a beer and headed home. Going to see if this can get resolved tomorrow.
Turns out this may be a bug in rclone that is potentially fixed in https://beta.rclone.org/branch/v1.49.0-041-g6ade4a26-fix-already-closed-beta/
Testing now; approximately 2 minutes to find out. (Of course, it could have just worked this time, who honestly knows.)
Maybe just drafting a policy to have the backups checked would make the most sense, but I'm not sure how or where that would work, as a lot of these things are just going to be a dumping ground of files.
Manually uploading the file with Cyberduck did work, so I'm not sure where to go with this...
So adjusting the ServerAliveInterval seems to have done the trick, it's a simple and small change; at least as far as I've tested since this morning.
But!
I'm having a hell of a time with rclone and this one damn file; it keeps failing to sync to B2 for some godforsaken reason, which is really ticking me off and ruining my PoC.
It'll copy the file to B2 to 100% and then start over again for some reason. The file is almost 4 GB (smaller than other files that have finished), and the logs don't point to a specific reason this one file is not syncing.
ffs!
Looks like I'll need to make one small edit to each workstation's ssh_config file. Adding
ServerAliveInterval 120
should do it. ConnectTimeout is also disabled by default (0), so that may be a factor as well.
Will test and report back.
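A minimal sketch of that edit (the Host * stanza and the ServerAliveCountMax value are assumptions on my part; 3 is the OpenSSH default):

```
# ~/.ssh/config (per user) or /etc/ssh/ssh_config (system-wide)
Host *
    # Send a keepalive probe after 120s of inactivity so idle
    # SFTP connections aren't silently dropped mid-sync
    ServerAliveInterval 120
    # Disconnect only after this many unanswered probes
    ServerAliveCountMax 3
```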
So with doing some more digging and running rclone in verbose mode, this appears to be an issue with the workstation closing the SSH connection.
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:46 ERROR : sftp://user@ip:22//Volumes/G-SPEED Shuttle TB3/Client/: Discarding closed SSH connection: EOF
2019-09-10 10:23:47 ERROR : Becoming Client/Renders/first4.mov: Failed to copy: failed to open source object: Open: couldn't connect SSH: dial tcp ip:22: connect: connection refused
2019-09-10 10:23:47 ERROR : B2 bucket G-Tech: not deleting files as there were IO errors
2019-09-10 10:23:47 ERROR : B2 bucket G-Tech: not deleting directories as there were IO errors
2019-09-10 10:23:47 ERROR : Attempt 3/3 failed with 3 errors and: failed to open source object: Open: couldn't connect SSH: dial tcp ip:22: connect: connection refused
Transferred: 15.329G / 15.329 GBytes, 100%, 3.126 MBytes/s, ETA 0s
Errors: 3 (retrying may help)
Checks: 637 / 637, 100%
Transferred: 1 / 1, 100%
Elapsed time: 1h23m41.8s
2019/09/10 10:23:47 Failed to sync with 3 errors: last error was: failed to open source object: Open: couldn't connect SSH: dial tcp ip:22: connect: connection refused
Anyone have any idea on how to better handle this?
Worth noting that at least a few of the files that errored with the POST code are being synced now, so is the POST error something to simply ignore?
This doesn't follow any pattern; large and small files seem to be randomly affected. Running the sync again sometimes works. Is there a better approach to syncing that I should be looking at?
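Since rclone itself says "retrying may help", one blunt workaround is a small retry wrapper around the sync. This is only a sketch; the rclone invocation in the comment is a hypothetical example (substitute your real remote and bucket names), and rclone's own --retries flag already covers much of this:

```shell
#!/bin/sh
# Re-run a command up to $max times, pausing briefly between attempts.
retry() {
    max=3
    n=0
    until "$@"; do
        n=$((n + 1))
        if [ "$n" -ge "$max" ]; then
            echo "giving up after $max attempts" >&2
            return 1
        fi
        sleep 1
    done
}

# Example usage (hypothetical remote/bucket names):
# retry rclone sync "sftp:/Volumes/source" "b2:my-bucket" -v
```

The loop returns success as soon as one attempt succeeds, so a transient SSH drop on attempt one doesn't fail the whole job.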
@BackblazeNathan Hoping you can help me to understand what's going on. I have rclone performing a PoC sync from an external volume that is backing up into our B2 account and bucket.
On a few files I'll get an error: File/Name Failed to copy: Post long url: file already closed
Based on the documentation here, the error code means the "method is not allowed".
What does this actually mean, and is there any way to quickly resolve the issue(s)?
@pmoncho said in With ESXi Licensing what happens if I let it lapse:
I believe that is why software updates is only $60 per year. That is dirt cheap considering it is ESXi.
Dirt cheap would be $0/year forever like with XCP-ng and KVM.
I'd happily offer my support for it but I can't make any promises on the quality of the support as I have a day job.
I.e., I don't work for an MSP where my job is to support others.
@carnival-boy said in Apple plans to scan your images for child porn:
No I wasn't. I said that the scanning was done on the phone, but that Apple can't access this data.
How can they not access this data? They must be keeping a copy of the hash records somewhere and comparing them with what is on the phone. Maybe they are keeping a record of the hashes on the phone as well, but this seems unlikely.
@dashrender said in I can't even:
@dustinb3403 said in I can't even:
@dashrender said in I can't even:
@dustinb3403 said in I can't even:
@voip_n00b said in I can't even:
@dustinb3403 I blocked port 80. Give me 443 or nothing.
And how are you blocking that while you're mobile? I can understand it for a location you control, but roaming seems to be a challenge and doesn't fix the issue.
Assuming there's a firewall available for your mobile device, you could conceivably do it.
Conceivability versus practicality is what I'm wondering about. Could someone set up a firewall for their cellphone? Sure, but would you really want to?
Of course not - who wants to manage that?
But the same can be said for boycotting sites that don't use HTTPS: if you care enough, you're potentially already doing it; if you don't, well, you never will.
I will not be purchasing Smithfield products if they can't provide the lowest level of security across their sites; that's a going-forward rule for me.
Do you know all of the products/vendors you use, and can you confirm that their sites are secured? Would you actually do such a thing?
@mmicha said in Where to start...:
@dustinb3403 As far as our file server goes, it mostly holds word, excel, pdf's, and yes autocad files our engineers create/use.
The IIS system runs an internal costing and sales app. Our actual website is hosted elsewhere.
WDS I figured wouldn't work, but I also considered Intune with M365 for that, possibly.
The accounting system is Sage ERP 300. It's a POS in my opinion.
Our internet can scale; we are on fiber at 50 Mbps currently.
Would you see any benefit to a split environment of cloud and on-premise if things like autocad files became slow to open?
I would actually draw up plans for a split environment for just that use case.
Which workloads would easily run offsite (IIS, SQL, etc.), and which systems benefit the most from being onsite?
File servers that hold CAD files won't necessarily need to be on site, but it's worth considering depending on how those services are used. A simple hypervisor (or pair) with enough capacity and performance, virtualized and set up to replicate between the sites (plus an offsite backup to Backblaze B2, AWS storage, etc.), could offer the best performance while giving you a high level of reliability.
Of course you'd have to take into consideration things like internet capacity, backup systems, power, etc. INAP could very well be a single datacenter (I don't know and didn't look); while it likely has all of the above, if a site outage occurred you'd be out of business as well.
Microsoft’s sneaky plan to switch Chrome searches from Google to Bing
Lol... BING is dead, Microsoft is attempting to force it on people...
So when do we start boycotting businesses that refuse to use SSL...
@mmicha said in Where to start...:
The idea of the cloud is mostly due to a sister company shifting to INAP / Single Hop
Got it, so they are moving to a colo service that offers cloud services. There are definitely benefits (as outlined above) but there are also negatives to making this move.
From what you've listed, I don't see anything that would have a major impact on cost, besides possibly the file and SQL servers.
Depending on the Cloud provider (INAP in this case) you may save some costs long or may end up stuck there with no easy means of moving your environment to another platform.
@IRJ is of the mindset that everything I say is stupid or insane, but the considerations still need to be understood.
The DCs are completely minimal to operate; the tiniest of VMs run anywhere should suffice, as they don't have any heavy workload.
The file server depends on what kind of files you're hosting: CAD files, or Word/Excel/PowerPoint-type stuff. Then of course there's the question of your internet connection to and from the colo; on the LAN you're likely using 1 GbE throughout (if not higher speeds). Is your ISP capable of supporting that, and if so, at what cost?
With SQL I'd have to know how it's being used, but this is also easily moved and can be scaled.
I ask about IIS only because I'm curious why you'd bother using it today: company website or some other service? It might make sense to move this workload to a website hosting provider entirely.
I'm assuming WDS is Windows Deployment Services; you may have major performance issues getting this to work over an internet connection.
What accounting system, QuickBooks?
@Obsolesce said in Miscellaneous Tech News:
Deepin Linux may ship with an AI Voice Assistant:
I read this yesterday on my phone; it seems like a really great add-on feature. Not that I need an AI assistant, but I know people who use these regularly.