    File Parsing Magic

    IT Discussion
    • scottalanmiller @anthonyh

      @anthonyh said in File Parsing Magic:

      @scottalanmiller

      Understood. I need to figure out a way to parse the file so that the process finds "user=" and pulls everything after it up to the next ";", then finds "ip=" and pulls everything after it up to the next ";".

      Yes, which is basically what I did, but the cut command can only use a single-character delimiter.
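Since cut can't handle the multi-character markers described above, sed's capture groups can do the "everything between KEY= and the next ;" extraction directly. This is a sketch against a made-up sample line in the user=...;ip=...; format the thread describes, not the actual file:

```shell
#!/bin/sh
# Hypothetical sample line in the format discussed: key=value pairs ending in ";"
line='user=jdoe;ip=192.168.1.50;port=8080;'

# Capture everything after "user=" up to (not including) the next ";";
# -n plus the /p flag prints only lines where the substitution matched
user=$(printf '%s\n' "$line" | sed -n 's/.*user=\([^;]*\);.*/\1/p')
ip=$(printf '%s\n' "$line" | sed -n 's/.*ip=\([^;]*\);.*/\1/p')

echo "$user"   # jdoe
echo "$ip"     # 192.168.1.50
```

The same two sed expressions work per-line across a whole file if you drop the variable and feed the file to sed directly.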

      • RamblingBiped @scottalanmiller

        @scottalanmiller said in File Parsing Magic:

        @anthonyh said in File Parsing Magic:

        @scottalanmiller

        Understood. I need to figure out a way to parse the file so that the process finds "user=" and pulls everything after it up to the next ";", then finds "ip=" and pulls everything after it up to the next ";".

        Yes, which is basically what I did, but the cut command can only use a single-character delimiter.

        Could he pipe it into awk, use the "." as a delimiter, and then print all fields preceding each "."?
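The awk route suggested above does work; note that for the user=...;ip=...; format earlier in the thread, the field separator would be ";" rather than ".". A sketch against a hypothetical sample line, splitting first on ";" and then on "=":

```shell
#!/bin/sh
# Hypothetical sample line in the format discussed earlier in the thread
line='user=jdoe;ip=192.168.1.50;port=8080;'

# -F';' splits each record on ";"; split() then breaks each field at "="
# so kv[1] is the key and kv[2] is the value
printf '%s\n' "$line" | awk -F';' '{
    for (i = 1; i <= NF; i++) {
        n = split($i, kv, "=")
        if (kv[1] == "user" || kv[1] == "ip") print kv[1] "=" kv[2]
    }
}'
# → user=jdoe
#   ip=192.168.1.50
```

Unlike cut, awk's -F also accepts a full regular expression, which is what lifts the single-character-delimiter restriction.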

        • Brett

          I'm very much a Linux noob, so I don't know which command to use. But I'd just use a regular expression, alone or in combination with some other command, to get the desired text here. In PowerShell I would use the -match operator and/or the Select-String cmdlet.
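The regex idea translates straight to the Linux side: GNU grep's -P (PCRE) mode supports lookbehind, which matches "everything after user= up to the next ;" without capture groups. A sketch against a hypothetical sample line (requires GNU grep; -P is not POSIX):

```shell
#!/bin/sh
# Hypothetical sample line in the format discussed in the thread
line='user=jdoe;ip=192.168.1.50;port=8080;'

# (?<=user=) is a lookbehind: it anchors the match after "user=" without
# including it; [^;]+ then grabs everything up to the next ";"
printf '%s\n' "$line" | grep -oP '(?<=user=)[^;]+'   # jdoe
printf '%s\n' "$line" | grep -oP '(?<=ip=)[^;]+'     # 192.168.1.50
```

This is roughly the moral equivalent of PowerShell's Select-String with a capture group.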
