MS Teen Girl AI Goes Horribly Wrong

IT Discussion
• mlnews

  http://www.telegraph.co.uk/technology/2016/03/24/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit/

• thanksajdotcom

  Yeah, this was pretty bad...

• gjacobse

  Yeah... that went south quick.

• DustinB3403

  @thanksajdotcom said:

  Yeah, this was pretty bad ass hysterical...

  FTFY, the bot went this route in under 24 hours because there were no controls in place.

  (I saw this a few days ago and was laughing pretty hard at the story)

• IRJ

  An artificial intelligence that learns entirely from tweets by anyone on the internet. What could possibly go wrong?

• JaredBusch

  That is just hilarious. Also a sad truth about the state of online streams of thought.

• scottalanmiller @JaredBusch

  @JaredBusch said:

  That is just hilarious. Also a sad truth about the state of online streams of thought.

  That's what I thought. It's really funny, but also sad. And an interesting insight into what online posting is like.

• bbigford

  No safety net or QA. Classic Microsoft.

• scottalanmiller @bbigford

  @BBigford said:

  No safety net or QA. Classic Microsoft.

  Well, raw and uncensored. They tried something daring and got... something daring.

• scottalanmiller

  I don't fault MS here; it was an interesting experiment, but...

• dafyre @scottalanmiller

  @scottalanmiller said:

  I don't fault MS here; it was an interesting experiment, but...

  That's just it... It was an experiment that ended... badly. Now they can go back and try again.

• scottalanmiller

  This falls under the development concept of "fail quickly."

• bbigford @scottalanmiller

  @scottalanmiller said:

  @BBigford said:

  No safety net or QA. Classic Microsoft.

  Well, raw and uncensored. They tried something daring and got... something daring.

  I was seriously blown away. Haha, I can picture them sitting around that discussion table. I get the feeling someone said "it mines the community's input, then automates the output in the form of a tweet... What could possibly go wrong?" The first mistake was underestimating the people of the Internet.

• BRRABill @bbigford

  @BBigford said:

  I was seriously blown away. Haha, I can picture them sitting around that discussion table. I get the feeling someone said "it mines the community's input, then automates the output in the form of a tweet... What could possibly go wrong?" The first mistake was underestimating the people of the Internet.

  FTFY
