    Syncing massive amounts of changing data to BackBlaze B2 via Linux

    IT Discussion
    Tags: b2, file storage, thoughtworks, active backup, disaster avoidance
    17 Posts, 6 Posters, 1.0k Views
    1337 @DustinB3403

      @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

      There are multiple ways of doing this with tools like Duplicati, CloudBerry, etc. (the integrations page goes on forever).

      In any scenario, if you had a high-turnover SMB share with large files (some of which might be 10 GB+ individual files) and multiple terabytes' worth of change in a week, how would you go about getting the data to B2?

      Down is the other half of the battle, which can be discussed afterwards.

      Using a command-line tool like rsync is one option, although I'm not sure how effective it would be over a long duration.

      To me it does sound like a cloud backup solution is the wrong solution for that use case.

      Have you done the math? Multiple TBs each week, say 5 TB per week: that is roughly 715 GB per day, 30 GB per hour, 500 MB per minute, or about 8 MB per second. So you need an average of roughly 65 Mbit per second, 24/7, to upload that amount of data (call it 80 Mbit/s once you allow for protocol overhead and headroom).
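      The conversion is easy to sanity-check; here is a quick sketch (the 5 TB/week figure is just the example above, not a measured rate):

      ```python
      # Back-of-the-envelope: sustained bandwidth needed to upload 5 TB per week.
      TB = 1000**4                     # decimal terabyte, in bytes
      weekly_bytes = 5 * TB            # example figure: 5 TB of changed data per week

      seconds_per_week = 7 * 24 * 3600
      bytes_per_second = weekly_bytes / seconds_per_week
      megabits_per_second = bytes_per_second * 8 / 1e6

      print(f"{bytes_per_second / 1e6:.1f} MB/s sustained")      # ~8.3 MB/s
      print(f"{megabits_per_second:.0f} Mbit/s sustained 24/7")  # ~66 Mbit/s
      ```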

    DustinB3403 @1337

        @Pete-S said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

        To me it does sound like a cloud backup solution is the wrong solution for that use case. Have you done the math?

        Bandwidth isn't an issue. The goal is to offload the data once the working files are collected and simply store them in a safe, relatively low-cost space without having to build something.

        I understand an onsite backup would be great, but it's also unrealistic to build, as the cost of the storage alone would be far too high.

        1337 @DustinB3403

          @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

          Bandwidth isn't an issue. The goal is to offload the data once the working files are collected and simply store them in a safe, relatively low-cost space without having to build something.

          I do understand what you're saying, but I do think bandwidth is an issue. You might have the bandwidth locally, but do you have that bandwidth consistently, 24/7, all the way to Backblaze's servers?

          DustinB3403 @1337

            @Pete-S 1 GbE symmetric, 24/7.

            1337 @DustinB3403

              @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

              @Pete-S 1 GbE symmetric, 24/7.

              So when you upload to Backblaze, you actually get 1 Gbit/s?

            DustinB3403 @1337

                @Pete-S said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                So when you upload to Backblaze, you actually get 1 Gbit/s?

                I haven't specifically checked, but once we get to L3 we do have 1 GbE.

                1337 @DustinB3403

                  @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                  I haven't specifically checked, but once we get to L3 we do have 1 GbE.

                  You could do a simple test here:
                  https://www.backblaze.com/speedtest/

                  I'm not sure it will tell the complete story, though. I understand that Backblaze only has one datacenter, in Sacramento, California. I don't know how many hops away that is for you. Any congestion, traffic shaping, etc. along the way will lower your bandwidth.

                  DustinB3403 @1337

                    @Pete-S At my workstation I'm getting 225 Mbit/s down and 155 Mbit/s up (clearly not symmetrical there...), but not bad either, considering I have nothing special configured for my workstation.

                    On a second test I noticed this: "A connection of 152.8 Mbps upload would back up 1,650 GB in a day."

                    So this very well could be feasible to do.
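                    The "1,650 GB in a day" figure from the speed test checks out against the measured upload rate:

                    ```python
                    # Sanity check of the speed-test claim: 152.8 Mbit/s upload, sustained 24 hours.
                    upload_mbps = 152.8
                    bytes_per_day = upload_mbps * 1e6 / 8 * 86400  # bits/s -> bytes over one day
                    gb_per_day = bytes_per_day / 1e9

                    print(f"{gb_per_day:,.0f} GB per day")         # ~1,650 GB per day
                    ```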

                    Dashrender @DustinB3403

                      @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                      So this very well could be feasible to do.

                      What you get is totally dependent on so many factors, and you know you can't control those factors over the internet.

                      DustinB3403 @Dashrender

                        @Dashrender said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                        What you get is totally dependent on so many factors, and you know you can't control those factors over the internet.

                        I understand that, but those speeds meet or exceed what would be created within a week. If the backup process took 2-3 days to complete, that would be fine.
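                        The "2-3 days" estimate holds up for the example 5 TB weekly figure at the measured upload speed:

                        ```python
                        # Time to upload one week's worth of changes (example: 5 TB)
                        # at the measured ~152.8 Mbit/s upload speed.
                        weekly_bytes = 5 * 1000**4          # 5 TB, decimal
                        upload_mbps = 152.8                 # measured upload, Mbit/s

                        seconds = weekly_bytes * 8 / (upload_mbps * 1e6)
                        days = seconds / 86400
                        print(f"{days:.1f} days")           # ~3.0 days
                        ```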

                          dafyre @DustinB3403

                            @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                            I understand that, but those speeds meet or exceed what would be created within a week. If the backup process took 2-3 days to complete, that would be fine.

                            If you already have B2, the best thing you could do, I think, is run it for a week and see how far it gets.

                            1337 @DustinB3403

                              @DustinB3403 said in Syncing massive amounts of changing data to BackBlaze B2 via Linux:

                              So this very well could be feasible to do.

                              Yes, that's not too bad. It could work. As @dafyre and others mentioned, you should give it a try.
                              $0.005 per GB is $5 per TB, so get an account and upload 2 TB of random data to see how long it takes. It's only going to cost you 10 bucks to find out.
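                              Assuming the $0.005/GB-month price quoted above (uploads to B2 are free; download and transaction fees are excluded here), the trial cost works out as:

                              ```python
                              # B2 storage cost for the proposed 2 TB trial at $0.005 per GB per month.
                              price_per_gb_month = 0.005
                              trial_gb = 2 * 1000                      # 2 TB expressed in GB

                              monthly_cost = trial_gb * price_per_gb_month
                              print(f"${monthly_cost:.2f} per month")  # $10.00 per month
                              ```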
