• 0 Votes
    37 Posts
    3k Views
    Obsolesce

    @gjacobse said in Designing for tech startup: Network, AD, Backup etc:

    @DustinB3403 said in Designing for tech startup: Network, AD, Backup etc:

    I suppose you could use Storage Spaces Direct (all Windows across the entire thing) but I wouldn't consider S2D at all mature nor production-ready, especially at this scale.

    Thanks, had not heard of this.

    DataON solutions fully support this, and vice versa. They are experienced with this kind of scale and much larger.

  • 0 Votes
    20 Posts
    2k Views
    J

    @Dashrender said in Domain Planning: Network shares or ,..:

    @dafyre said in Domain Planning: Network shares or ,..:

    @notverypunny said in Domain Planning: Network shares or ,..:

    Does NC allow exposure of their "file shares" as SMB? If you have users that can't / don't want to use browser-based access, they can always mount it in Windows Explorer via WebDAV. Alfresco allows (allowed?) access via both, but the last time I played with it the performance was meh, which I attributed to it being built on Java...

    You can mount NextCloud into a drive letter or folder using WebDAV.

    The question does become the aforementioned performance issue (if there is one).

    I wonder how file locks are handled when using WebDAV?

    There are a few other topics here where file locking and cloud hosting were discussed. You do have to give up what we have all come to appreciate in file locking. Here is a response from one of those other topics I mentioned:

    @scottalanmiller said in file sharing in the 21st century:

    @Donahue said in file sharing in the 21st century:

    I am aware of that. It's online locking that I am after. Though, I will concede that any locking scheme has to plan for both online and offline. I like sync because of local performance and offline availability, but it really feels like it is best for non shared files. When you add multiple users into the mix, almost everything goes out the window, especially when and if they go offline.

    Everything is best for non-shared files 🙂

    SMB shines at "always online, always nearly local" files precisely because it handles offline so poorly. It's a balance: to handle offline or very distant (e.g. high-latency) networks well, you have to sacrifice locking.
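
    For anyone who wants to try the WebDAV mount mentioned above, a generic sketch (server name and username are placeholders; the remote.php/dav path is Nextcloud's usual WebDAV endpoint, and the WebClient service must be running):

    net use N: "https://cloud.example.com/remote.php/dav/files/jdoe/" /user:jdoe /persistent:yes

    Performance and locking behavior over WebDAV won't match SMB, so test with your actual workload.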

  • 3 Votes
    6 Posts
    983 Views
    dbeato

    @scottalanmiller said in Server 2012 PS: Script to find OU path:

    @dbeato he's looking for NTG admin accounts on someone else's domain.

    Good!

  • 0 Votes
    5 Posts
    813 Views
    gjacobse

    Since I was working with only ONE user, this is what I needed to change it to; otherwise I was getting "Parameter set cannot be resolved".

    Import-Module ActiveDirectory
    Get-ADUser -Filter {Name -eq "SomeUser"} -SearchBase "OU=Users,OU=OUGroup,DC=DOMAINname,DC=com" |
        Set-ADUser -ScriptPath "\\SERVERNAME\netlogon\2018ADUC-script.txt"

    But it worked!
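
    For reference, the "Parameter set cannot be resolved" error typically comes from mixing parameters that belong to different parameter sets (for example, passing an -Identity value together with -Filter or -SearchBase). A rough sketch of the bulk, whole-OU form of the same command, using the same OU and script path as above:

    Import-Module ActiveDirectory
    Get-ADUser -Filter * -SearchBase "OU=Users,OU=OUGroup,DC=DOMAINname,DC=com" |
        Set-ADUser -ScriptPath "\\SERVERNAME\netlogon\2018ADUC-script.txt"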

  • Domain Computers: Clock Sync

    IT Discussion
    0 Votes
    11 Posts
    1k Views
    dbeato

    @gjacobse said in Domain Computers: Clock Sync:

    DC4 is a virtual machine. Changes to it were likely overridden by the hypervisor or physical hardware, which were wrong.

    After updating the physical hardware and then running w32tm /resync, the time updated.

    The issue will resurface, as the BIOS is losing time and the wrong time will be pulled from the host again. Have you tried not syncing the VM's time through the hypervisor and turning off Time Synchronization for this DC?
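
    A generic way to keep this from recurring, assuming a Hyper-V host and that this DC holds (or syncs from) the PDC emulator; names below are placeholders:

    # On the Hyper-V host: stop pushing host time into the DC guest.
    Disable-VMIntegrationService -VMName "DC4" -Name "Time Synchronization"

    # On the PDC emulator: sync from an external NTP source instead, then resync.
    w32tm /config /manualpeerlist:"0.pool.ntp.org 1.pool.ntp.org" /syncfromflags:manual /reliable:yes /update
    Restart-Service w32time
    w32tm /resync /rediscover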

  • 0 Votes
    5 Posts
    1k Views
    travisdh1

    @eddiejennings said in Logging Domain user authentication failures:

    @travisdh1 said in Logging Domain user authentication failures:

    @eddiejennings No OSSEC, Wazuh, or other security monitoring available? All of the ones I've looked at monitor logins by default. It should be easy to customize a report for whatever you need.

    I haven't had to set this up in a Windows environment yet, so I'm also curious as to what you end up doing.

    We do have ExtraHop; however, it's not capturing all the traffic it should (and another team is in charge of its configuration), so using auditing on the domain controllers is a bit of a stop-gap measure.

    Ah. What an ..... effective use of resources.

    Good luck. ExtraHop is very nice, but like every other tool, it's useless until deployed properly.
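
    For the auditing stop-gap, a minimal sketch of what that can look like on a domain controller (event ID 4625 is the standard failed-logon event; 4771 covers Kerberos pre-authentication failures):

    # Ensure failure auditing is enabled for logon events (or set this via GPO instead).
    auditpol /set /subcategory:"Logon" /failure:enable

    # Pull the last day's authentication failures from the Security log.
    Get-WinEvent -FilterHashtable @{ LogName = 'Security'; Id = 4625; StartTime = (Get-Date).AddDays(-1) } |
        Select-Object TimeCreated, Id, Message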

  • 0 Votes
    5 Posts
    951 Views
    DustinB3403

    @gjacobse

    You should just need to run this bit; nothing should have to be changed. The domain functional level (DFL) should be at least 2008.

    # Imports Active Directory information
    Import-Module ActiveDirectory

    # Prompts for user credentials (default user is blank); enter an administrator account in the form of "domain-name\administrator-account"
    $credentials = Get-Credential

    Get-ADUser -Credential $credentials -Filter * -Properties DisplayName,EmailAddress,memberof,DistinguishedName,Enabled | ForEach-Object {
        New-Object PSObject -Property @{
            UserName          = $_.DisplayName
            EmailAddress      = $_.EmailAddress
            DistinguishedName = $_.DistinguishedName
            Enabled           = $_.Enabled
            # Delimits the group list with ";" -- incredibly useful for copy and paste of group memberships for new hire employees.
            Groups            = ($_.memberof | Get-ADGroup | Select-Object -ExpandProperty Name) -join ";"
        }
    # The export path is variable; change it to the desired location on the domain controller or end-user computer.
    } | Select-Object UserName,EmailAddress,@{l='OU';e={$_.DistinguishedName.split(',')[1].split('=')[1]}},Groups,Enabled |
        Sort-Object UserName |
        Export-Csv $ENV:UserProfile\Documents\User-Permissions.csv -NoTypeInformation
  • 0 Votes
    2 Posts
    981 Views
    DustinB3403

    I got you

    # This script will export all users of the specified domain, and their group memberships, to a CSV file.
    # The usefulness of this tool shows when setting up new hire employees or reviewing domain membership permissions.
    # It's not advisable to store the user credentials required to run this script, as they can be decrypted. This script is not designed to save these credentials, but could be modified to do so.
    # Use of this script implies that you understand what it does, and will do, with regard to your Active Directory members and group memberships.
    # As designed, no changes are made to your installation; the script simply generates a report of members and their group memberships.
    # Any changes to this script are the responsibility of the person/organization that made said changes.
    # We cannot be held responsible for your misuse or misunderstanding of this script as it was designed.

    # Imports Active Directory information
    Import-Module ActiveDirectory

    # Prompts for user credentials (default user is blank); enter an administrator account in the form of "domain-name\administrator-account"
    $credentials = Get-Credential

    Get-ADUser -Credential $credentials -Filter * -Properties DisplayName,EmailAddress,memberof,DistinguishedName,Enabled | ForEach-Object {
        New-Object PSObject -Property @{
            UserName          = $_.DisplayName
            EmailAddress      = $_.EmailAddress
            DistinguishedName = $_.DistinguishedName
            Enabled           = $_.Enabled
            # Delimits the group list with ";" -- incredibly useful for copy and paste of group memberships for new hire employees.
            Groups            = ($_.memberof | Get-ADGroup | Select-Object -ExpandProperty Name) -join ";"
        }
    # The export path is variable; change it to the desired location on the domain controller or end-user computer.
    } | Select-Object UserName,EmailAddress,@{l='OU';e={$_.DistinguishedName.split(',')[1].split('=')[1]}},Groups,Enabled |
        Sort-Object UserName |
        Export-Csv $ENV:UserProfile\Documents\User-Permissions.csv -NoTypeInformation

    #Function Get-SaveFile($initialDirectory)
    #{
    #    [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
    #
    #    $SaveFileDialog = New-Object System.Windows.Forms.SaveFileDialog
    #    $SaveFileDialog.initialDirectory = $initialDirectory
    #    $SaveFileDialog.filter = "All files (*.*)| *.*"
    #    $SaveFileDialog.ShowDialog() | Out-Null
    #    $SaveFileDialog.filename
    #}
    #
    # Open dialog box to select the .nessus file.
    #$InputFile = Get-OpenFile
    #$OutputFile = Get-SaveFile
    #
    #$Contents = [io.file]::ReadAllText($inputfile)
    #$Contents = [io.file]::ReadAllText('C:\tools\wd\nessus\data\data.xml')
    #$Global:OutFile = [System.IO.StreamWriter] "c:\tools\wd\nessus\outfile.csv"
    #
    ##$InputFile
    #$OutputFile
    #
  • 5 Votes
    1 Posts
    731 Views
    No one has replied
  • 2 Votes
    35 Posts
    5k Views
    gjacobse

    Sigh - well, there's the problem. Well, likely anyway.

    0_1523795163065_09b3cac7-ae14-4a21-beae-d0ab3194a745-image.png

  • Exchange Shell command not working

    IT Discussion
    0 Votes
    18 Posts
    2k Views
    dbeato

    @brianlittlejohn said in Exchange Shell command not working:

    You also need to have export/import permissions granted to your account. They are not granted to anyone by default.

    This is what you need.

    First assign your user the Mailbox Import Export role:
    New-ManagementRoleAssignment -Role "Mailbox Import Export" -User "<user name or alias>"

    Then make sure the Exchange Trusted Subsystem group has access to that shared folder, and then run your export:
    New-MailboxExportRequest -Mailbox username -FilePath "\\server\PST Folder\username.pst"
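
    Once the request is queued, progress can be checked with the matching cmdlets (a generic example):

    Get-MailboxExportRequest | Get-MailboxExportRequestStatistics | Select-Object Name, Status, PercentComplete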

  • Shadow Protect and Disk IOPS usage..

    IT Discussion
    0 Votes
    30 Posts
    5k Views
    scottalanmiller

    @MattSpeller said in Shadow Protect and Disk IOPS usage..:

    @scottalanmiller said in Shadow Protect and Disk IOPS usage..:

    @MattSpeller said in Shadow Protect and Disk IOPS usage..:

    @scottalanmiller said in Shadow Protect and Disk IOPS usage..:

    @DustinB3403 said in Shadow Protect and Disk IOPS usage..:

    The read usage on this system is insane for what it is normally doing.

    Even for a tiny SW install we recommend some hefty resources and dedicated SSDs. You definitely just figured out the problem. Had you led with this, we could have told you instantly what the issue was.

    Our rather beefy SW install runs on rust just fine. Not even a half decent rust array. Old junk.

    I'm surprised. Even when ours was tiny we gave it 100,000 IOPS and it remained slow.

    ¯\_(ツ)_/¯

    There's only 3 of us in IT here. I suspect more concurrent users would kill it.

    We had probably ten or fifteen and loads of tickets. But that isn't that much of a jump, I wouldn't think.

  • 1 Votes
    13 Posts
    3k Views
    gjacobse

    In this case, it was a matter of the settings during the restore.

    Under GENERAL you had to select your source and destination as normal; the destination was changed so that it pointed to the new History database. Under FILES you updated the DB and LOG file names to reflect the new DB, otherwise you would overwrite the originals.

    This is where they borked it. They didn't mention FILES at all, only going to OPTIONS, and that is where they mentioned updating the file names. The main discovery was that you need to uncheck Leave source database in the restoring state.

    When I emailed them about removing the 'borked' databases I had created, they called me back. I mentioned it to the fellow and we had a short discussion on the matter, where he took notes and agreed that the directions were incorrect. When I got to the part about unchecking Leave source database in the restoring state, he mentioned that he also unchecks Take tail-log backup before restore.

    Hope this helps.

    0_1460118217976_2016-04-08 08_22_29-NTG - SSI-SQL01 - Connected.png
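
    For anyone scripting the same kind of restore instead of using the SSMS dialogs, a minimal sketch with the SqlServer module (logical file names, paths, and the backup share below are placeholders -- check yours under FILES in the restore dialog or with RESTORE FILELISTONLY):

    Import-Module SqlServer

    # Relocate the data and log files so the original database's files are not overwritten.
    $dataMove = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('History_Data', 'D:\SQLData\History_Restore.mdf')
    $logMove  = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile('History_Log', 'L:\SQLLogs\History_Restore.ldf')

    # Restore the backup as a new database; no tail-log backup is taken and the source stays online.
    Restore-SqlDatabase -ServerInstance 'SSI-SQL01' -Database 'History_Restore' `
        -BackupFile '\\BACKUPSRV\SQL\History_Full.bak' -RelocateFile @($dataMove, $logMove)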

  • 0 Votes
    6 Posts
    2k Views
    JaredBusch

    If it is a restart of the Hyper-V host, then check the services once it fails to respond like that.
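
    A quick way to do that check remotely (host name below is a placeholder):

    Get-Service -ComputerName HV-HOST01 -DisplayName 'Hyper-V*' | Select-Object Name, DisplayName, Status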

  • WDS Remove legacy Images

    Solved IT Discussion
    1 Votes
    5 Posts
    1k Views
    scottalanmiller

    Marked as answered.

  • 3 Votes
    16 Posts
    3k Views
    gjacobse

    @coliver said:

    repadmin /replicate DC1 DC2 dc=domain,dc=com

    So I have had limited success... but success was made. The user I created on the other end is now in the local AD.

    I'll look at this more, but I think it's otherwise solved.
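
    For anyone checking the same thing later, replication health can be sanity-checked from any DC with the usual commands (generic; substitute your own DC names):

    repadmin /replsummary
    repadmin /showrepl DC1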

  • 0 Votes
    14 Posts
    3k Views
    coliver

    @Dashrender said:

    @coliver said:

    @Dashrender said:

    Folders in that list that don't have the double folder-within-a-folder icon are not OUs. As such, GPOs don't apply to objects in them.

    learned this the hard way a long time ago.

    I guess I don't understand what this means. What icon are you talking about?

    Missed this.

    0_1455852916491_ou.JPG

    Ok thanks, that makes more sense.
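
    A small illustration of the difference (run on a DC or with RSAT): only organizationalUnit objects can have GPOs linked, while built-ins like CN=Users and CN=Computers are plain containers.

    Import-Module ActiveDirectory
    Get-ADObject -SearchBase (Get-ADDomain).DistinguishedName -SearchScope OneLevel `
        -Filter 'ObjectClass -eq "organizationalUnit" -or ObjectClass -eq "container"' |
        Select-Object Name, ObjectClass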

  • 1 Votes
    21 Posts
    5k Views
    kamidon

    @Dashrender What's weird though is I've gotten them to work here and there despite using msiexec. BUT then I changed some things around (added spaces... spaces make life hell, but I wanted to see if I had to use " " around the path or not... utter failure; upon changing things back... I broke the installs).

    But yeah, I'm happy 🙂
    Now I just need to add what I've figured out to the other programs, which a few auto-activate lol, testing will have to be live.
    Ugh then I have to make more deployments for our Autodesk products, which actually isn't that bad..
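
    For what it's worth, a generic example of quoting a UNC path with spaces for msiexec (share and MSI name are hypothetical):

    msiexec /i "\\FILESERVER\Deploy\Some App\SomeApp.msi" /qn /norestart /l*v "C:\Windows\Temp\SomeApp-install.log"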

  • 2 Votes
    11 Posts
    2k Views
    Minion Queen

    @scottalanmiller said:

    Isn't this a client who was fired? Problem fixed!

    Yup... unfortunately the client refused to put in the hardware, required all updates to all machines be shut off, and requested all machines be rolled back from Windows 10, which he hated....