Backup device for local or colo storage

• Dashrender

  No wonder you have issues!

  So the iSCSI traffic for the Buffalo goes over the same NIC as the traffic being sent to the 2 Synology devices?

  And it's all driven by the StorageCraft software that's running on the Server 2003 box?

  What does the Buffalo device do that's different than the 2 Synology devices?

  Is the Buffalo the primary storage, boot storage, etc., for the Server 2008?
• DustinB3403

  The iSCSI target is housing our network shares.

  The Buffalo is being decommissioned, but it was a backup device.
• scottalanmiller @DustinB3403

  @DustinB3403 said:

    We also have an ancient "archive server" which has 6 drives, running Server 2003 which actually runs the Storage Craft software. Single NIC connected, 1Gbe, 8GB RAM with a Quad Core AMD Opteron 1385 CPU.

  So everything flows through this machine? All 8TB of backups goes through this choke point? Have you checked CPU to see if it is maxed out? Memory to see if it is exhausted? IOPS to see if you are beyond the limits of the drives?
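A quick way to answer those questions is to sample the box while a backup job is running and see which resource pegs first. This is only a sketch, assuming Python 3 and the third-party psutil package on whatever machine sits in the backup data path; neither is part of the setup described in this thread, and the thresholds are generic rather than anything StorageCraft-specific.

    # Rough resource sampler: run on the backup/archive host while a job is active.
    # Requires the third-party psutil package (pip install psutil).
    import psutil

    SAMPLES = 60  # watch for about a minute; each loop takes ~1 second

    prev_net = psutil.net_io_counters()
    prev_disk = psutil.disk_io_counters()

    for _ in range(SAMPLES):
        cpu = psutil.cpu_percent(interval=1)      # % CPU averaged over the last second
        mem = psutil.virtual_memory().percent     # % of RAM in use
        net = psutil.net_io_counters()
        disk = psutil.disk_io_counters()

        rx_mbps = (net.bytes_recv - prev_net.bytes_recv) * 8 / 1e6
        tx_mbps = (net.bytes_sent - prev_net.bytes_sent) * 8 / 1e6
        iops = (disk.read_count - prev_disk.read_count) + \
               (disk.write_count - prev_disk.write_count)

        print(f"cpu={cpu:5.1f}%  mem={mem:5.1f}%  "
              f"rx={rx_mbps:7.1f} Mb/s  tx={tx_mbps:7.1f} Mb/s  iops={iops}")

        prev_net, prev_disk = net, disk

If the CPU column sits near 100% while the network columns stay well under ~800 Mb/s, the old Opteron is the choke point; if the network column flat-lines near line rate, the single 1GbE link is.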
• DustinB3403

  The Server 2003 box is horribly slow. CPU usage is constantly peaking. Memory usage doesn't seem to be hit very hard.

  But this device is also looking to be tossed. I was considering just using it for drive space as just another backup-of-our-backup sort of device.

  Maybe not?
• scottalanmiller @DustinB3403

  @DustinB3403 said:

    1Gbe

  1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.
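For anyone wanting to sanity-check those numbers, the arithmetic is just data size divided by link throughput. A minimal sketch in plain Python, not tied to any of the tools in this thread; exact hours shift a bit depending on whether you count decimal TB or binary TiB and how much protocol overhead you assume, but the conclusion is the same.

    # Back-of-the-envelope transfer time: size divided by effective link speed.
    def transfer_hours(size_tb, link_gbps, efficiency=1.0):
        bits = size_tb * 1e12 * 8                        # decimal terabytes -> bits
        return bits / (link_gbps * 1e9 * efficiency) / 3600

    for label, gbps, eff in [
        ("1 Gb/s at wire speed", 1.0, 1.0),
        ("1 Gb/s at ~80% realistic", 1.0, 0.8),
        ("bonded pair of 10 Gb/s", 20.0, 1.0),
    ]:
        print(f"{label:<28} {transfer_hours(8, gbps, eff):5.1f} h for 8 TB")

That prints roughly 18, 22 and 0.9 hours respectively: a single gigabit link means the better part of a day per full copy, while a bonded 10GbE pair brings it down to about an hour, in line with the figures quoted in this thread.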
• DustinB3403

  Even in relatively idle times this server is slow. I don't know how old it even is. 6-8 years, maybe.
• scottalanmiller

  Realistically, you need a core backup infrastructure of 10Gb/s in a bonded pair, which would drop your network bottleneck from 21.2 hours to 1.05 hours. Of course other bottlenecks will be exposed. But this is key. Your fundamental network infrastructure cannot handle your backup needs. This means you cannot restore in an emergency either. Nothing you do will speed it up; waiting a full day minimum would be your only option. And likely you would need to do a lot of different things at once and be unable to keep the line fully saturated for a full day while doing the restore.
• Dashrender @scottalanmiller

  @scottalanmiller said:

    @DustinB3403 said:

      1Gbe

    1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.

  Well, then he's actually doing pretty good, if he says it takes around 24 hours to back up the whole system (all current 6 TB). He might have a bottleneck somewhere, but not a horrible one.
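Running that backwards supports the point. A tiny check, again plain Python, using the thread's own numbers of roughly 6 TB in about 24 hours:

    # What sustained throughput does "6 TB in ~24 hours" imply?
    size_bits = 6 * 1e12 * 8          # 6 TB (decimal) in bits
    window_s = 24 * 3600              # 24 hours in seconds
    print(f"~{size_bits / window_s / 1e6:.0f} Mb/s sustained")   # about 555 Mb/s

Roughly 555 Mb/s sustained is about 70% of the ~800 Mb/s realistic ceiling for a single gigabit link, which is why the current setup reads as slow but not catastrophically bottlenecked.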
• scottalanmiller @DustinB3403

  @DustinB3403 said:

    Even in relatively idle times this server is slow. I don't know how old it even is. 6-8 years, maybe.

  Given that 2003 R2 came out in 2005, it is presumably 10+ years old.
• Dashrender @scottalanmiller

  @scottalanmiller said:

    1Gb/s has a realistic maximum transfer rate of 800Mb/s and that would be HARD to hit and sustain. 8TB on 1Gb/s is 21.2 hours to copy. That's with zero bottlenecks anywhere, just wide open streaming without ever dropping the speed.

  This math alone proves that using NAUBackup to create full backups won't really be much better than the current solution. Definitely sounds like it's time for a network upgrade.
• DustinB3403

  Or just a dedicated 10Gig switch for the management port on Xen and the onsite backup solutions.
• DustinB3403

  Of course I'd have to put 10GbE NICs into the host servers.
• scottalanmiller @DustinB3403

  @DustinB3403 said:

    Of course I'd have to put 10GbE NICs into the host servers.

  Not necessarily; you only need your aggregate to be faster. I'm assuming that bonded NICs have not been set up? Get that fixed. If every server was 2Gb/s and the backup host was 10Gb/s, you'd take rather an amazing leap forward just there. Probably enough to find other bottlenecks.
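To put rough numbers on that suggestion, here is a small sketch with assumed figures (the server count, per-server bond speed and 80% efficiency are placeholders, not measurements from this environment), showing how the backup window changes once the backup host's inbound link, rather than each server's single gigabit NIC, is the limit.

    # Assumed scenario: several source servers bonded to 2 Gb/s each,
    # one backup host on 10 Gb/s, ~8 TB total, ~80% effective utilization.
    def window_hours(total_tb, n_servers, server_gbps, host_gbps, efficiency=0.8):
        aggregate_gbps = min(n_servers * server_gbps, host_gbps) * efficiency
        return total_tb * 1e12 * 8 / (aggregate_gbps * 1e9) / 3600

    print(f"today, single 1 Gb/s into the backup host:  {window_hours(8, 1, 1, 1):.1f} h")
    print(f"five servers at 2 Gb/s into a 10 Gb/s host: {window_hours(8, 5, 2, 10):.1f} h")

Once the sources can collectively exceed the backup host's link, the 10GbE pipe on the host is what sets the window (a bit over two hours in this sketch), which is why upgrading that end first pays off before touching every server.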
• scottalanmiller

  If you identify a single server that needs more speed, you can go up to triple or quadruple GigE if need be before making a leap to 10GigE connections.
• scottalanmiller

  You might find a single server or two with 10GigE needs, but likely not the majority. Spend opportunistically.
• scottalanmiller

  Might as well loop in StorageCraft themselves too: @Steven
• Dashrender @scottalanmiller

  @scottalanmiller said:

    If you identify a single server that needs more speed, you can go up to triple or quadruple GigE if need be before making a leap to 10GigE connections.

  What's the current cost for a 10GigE card? Assuming he doesn't already have open GigE ports, he'll need to buy regardless.
• Dashrender

  I'm surprised, an unmanaged 8-port 10GigE switch is $760:

  http://www.newegg.com/Product/Product.aspx?Item=N82E16833122529

  A two-port card from Dell is $650. Third party might be considerably less.
• scottalanmiller

  Yup, I've been pushing Netgear 10GigE for a long time now. I think that Dell has some decent 10GigE fiber switches for around $2K as well.
• Dashrender

  Damn, the 12-port is double the price of the 8-port, at $1,450... ouch!