
    MS Teen Girl AI Goes Horribly Wrong

    IT Discussion
    14 Posts 10 Posters 1.5k Views
    • gjacobse

      Yeah... that went south quick.

      • DustinB3403

        @thanksajdotcom said:

        Yeah, this was pretty ~~bad ass~~ hysterical...

        FTFY, the bot went this route in under 24 hours because there were no controls in place.

        (I saw this a few days ago and was laughing pretty hard at the story)

        • IRJ

          An artificial intelligence that learned entirely from tweets sent by anyone on the internet. What could possibly go wrong?

          • JaredBusch

            That is just hilarious. Also a sad truth about the state of online streams of thought.

            • scottalanmiller @JaredBusch

              @JaredBusch said:

              That is just hilarious. Also a sad truth about the state of online streams of thought.

              That's what I thought. It's really funny, but also sad. And an interesting insight into what online posting is like.

              • bbigford

                No safety net or QA. Classic Microsoft.

                • scottalanmiller @bbigford

                  @BBigford said:

                  No safety net or QA. Classic Microsoft.

                  Well, raw and uncensored. They tried something daring and got... something daring.

                  • scottalanmiller

                    I don't fault MS here, it was an interesting experiment but...

                    • dafyre @scottalanmiller

                      @scottalanmiller said:

                      I don't fault MS here, it was an interesting experiment but...

                      That's just it.... It was an experiment that ended.... badly. Now they can go back and try again.

                      • scottalanmiller

                        This falls under the development concept of "fail quickly."

                        • bbigford @scottalanmiller

                          @scottalanmiller said:

                          @BBigford said:

                          No safety net or QA. Classic Microsoft.

                          Well, raw and uncensored. They tried something daring and got... something daring.

                          I was seriously blown away. Ha, I can just picture them sitting around that discussion table. I get the feeling someone said "it mines the community's input, then automates the output in the form of a tweet... What could possibly go wrong?" The first mistake was underestimating the people of the Internet.

                          • BRRABill @bbigford

                            @BBigford said:

                            I was seriously blown away. Ha, I can just picture them sitting around that discussion table. I get the feeling someone said "it mines the community's input, then automates the output in the form of a tweet... What could possibly go wrong?" The first mistake was underestimating the people of the Internet.

                            FTFY
