
Backup device for local or colo storage

IT Discussion
Tags: backup, disaster recovery
• scottalanmiller @DustinB3403

      @DustinB3403 said:

1GbE

      1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.
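The figures above can be sanity-checked with a quick calculation. This is a minimal sketch, assuming an effective throughput of roughly 105 MB/s for a saturated 1Gb/s link after protocol overhead (about the 800Mb/s figure quoted in the post):

```python
# Back-of-the-envelope copy-time estimate.
# Assumption: ~105 MB/s effective throughput on a fully saturated
# 1 Gb/s link, i.e. the realistic ceiling quoted above.

def copy_hours(data_tb: float, throughput_mb_s: float) -> float:
    """Hours to move data_tb terabytes at throughput_mb_s megabytes/sec."""
    data_mb = data_tb * 1_000_000  # decimal units: 1 TB = 10^6 MB
    return data_mb / throughput_mb_s / 3600

print(round(copy_hours(8, 105), 1))  # -> 21.2 hours for 8 TB over 1 GbE
```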

• DustinB3403

Even at relatively idle times this server is slow. I don't know how old it even is; 6-8 years, maybe.

• scottalanmiller

Realistically, you need a core backup infrastructure of 10Gb/s in a bonded pair, which would drop your network bottleneck from 21.2 hours to about 1.05 hours. Of course other bottlenecks will be exposed, but this is key: your fundamental network infrastructure cannot handle your backup needs, which means you cannot restore in an emergency either. Nothing you do will speed it up; waiting a full day, minimum, would be your only option. And likely you would need to do a lot of different things at once, making it very hard to keep the line fully saturated for a full day while doing the restore.
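The jump quoted here is just proportional scaling: a bonded pair of 10Gb/s links is 20x the raw bandwidth of a single 1Gb/s link, so the window shrinks by the same factor (assuming the network remains the only bottleneck; the post's 1.05-hour figure is the same division with rounding):

```python
# Scaling the 1 GbE backup window by the bonded-pair bandwidth ratio.
single_gbe_hours = 21.2
speedup = (2 * 10) / 1           # two bonded 10 Gb/s links vs one 1 Gb/s link
bonded_hours = single_gbe_hours / speedup
print(round(bonded_hours, 2))    # -> 1.06 hours
```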

• Dashrender @scottalanmiller

            @scottalanmiller said:

            @DustinB3403 said:

1GbE

            1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.

Well, then he's actually doing pretty well, if he says it takes around 24 hours to back up the whole system (all current 6 TB). He might have a bottleneck somewhere, but not a horrible one.

• scottalanmiller @DustinB3403

              @DustinB3403 said:

Even at relatively idle times this server is slow. I don't know how old it even is; 6-8 years, maybe.

Given that Windows Server 2003 R2 came out in 2005, it is presumably 10+ years old.

• Dashrender @scottalanmiller

                @scottalanmiller said:

                1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.

                This math alone proves that using NAUBackup to create full backups won't really be much better than the current solution. Definitely sounds like it's time for a network upgrade.

• DustinB3403

Or just a dedicated 10GigE switch for the management port on Xen and the onsite backup solutions.

• DustinB3403

Of course I'd have to put 10GbE NICs into the host servers.

• scottalanmiller @DustinB3403

                      @DustinB3403 said:

Of course I'd have to put 10GbE NICs into the host servers.

Not necessarily; you only need your aggregate to be faster. I'm assuming that bonded NICs have not been set up? Get that fixed. If every server was 2Gb/s and the backup host was 10Gb/s, you'd take rather an amazing leap forward just there. Probably enough to find other bottlenecks.
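The aggregate argument can be sketched numerically. Assuming hypothetical host counts for illustration, the achievable combined backup rate is capped by whichever side is narrower, the sum of the source links or the backup host's NIC:

```python
# Best-case combined transfer rate into a backup host: the narrower of
# the summed source links and the backup host's own link.
# Host counts here are hypothetical, not from the thread.

def aggregate_gbps(host_link_gbps: float, host_count: int,
                   backup_link_gbps: float) -> float:
    """Best-case combined rate (Gb/s) into the backup host."""
    return min(host_link_gbps * host_count, backup_link_gbps)

print(aggregate_gbps(2, 4, 10))  # -> 8: four 2 Gb/s hosts can't fill a 10 Gb/s NIC
print(aggregate_gbps(2, 6, 10))  # -> 10: with six hosts, the backup NIC is the cap
```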

• scottalanmiller

                        If you identify a single server that needs more speed, you can go up to triple or quadruple GigE if need be before making a leap to 10GigE connections.

• scottalanmiller

                          You might find a single server or two with 10GigE needs, but likely not the majority. Spend opportunistically.

• scottalanmiller

                            Might as well loop in StorageCraft themselves too: @Steven

• Dashrender @scottalanmiller

                              @scottalanmiller said:

                              If you identify a single server that needs more speed, you can go up to triple or quadruple GigE if need be before making a leap to 10GigE connections.

What's the current cost for a 10GigE card? Assuming he doesn't already have open GigE ports, he'll need to buy something regardless.

• Dashrender

I'm surprised: an unmanaged 8-port 10GigE switch is $760.

http://www.newegg.com/Product/Product.aspx?Item=N82E16833122529

A two-port card from Dell is $650. Third-party might be considerably less.

• scottalanmiller

                                  Yup, I've been pushing Netgear 10GigE for a long time now. I think that Dell has some decent 10GigE fiber switches for around $2K as well.

• Dashrender

Damn, the 12-port is double the price of the 8-port: $1,450. Ouch!

• scottalanmiller @Dashrender

                                      @Dashrender said:

Damn, the 12-port is double the price of the 8-port: $1,450. Ouch!

I bet if you check, the backplane gets a lot faster.

• DustinB3403

That might be the route we go: a 10GigE switch with bonded NICs, or dedicated 10GigE NICs on each host.

• Dashrender

                                          How many VM hosts do you have?

• DustinB3403

1 currently, which is standalone.

The equipment we're looking into would be a dual-host setup, "primary/primary" so to speak.
