Ben Shapiro
Ben Shapiro debates an AI chatbot from OpenAI. The chatbot is supposed to give objective answers, drawing on extraordinary amounts of information from the internet (up to 2021) and consolidating it into responses.
This video is sponsored by Ring. Live a little more stress-free this season with a Ring product that’s right for you: https://ring.com/collections/offers.
Watch the member-only portion of my show on DailyWire+: https://utm.io/ueSuX
LIKE & SUBSCRIBE for new videos every day. https://www.youtube.com/c/BenShapiro
Stop giving your money to woke corporations that hate you. Get your Jeremy’s Razors today at https://www.ihateharrys.com
Grab your Ben Shapiro merch here: https://tinyurl.com/yadn58uk
#BenShapiro #TheBenShapiroShow #News #Politics #DailyWire #artificialintelligence #ai #chatgpt #chatbot #openai #technews #technology #currentnews #tech #google #searchengine
Great video! I absolutely love the sound of the AI response! LOLOL!!!
Why am I only watching this now 😭🤣 This is so fun to watch
Under the law, marriage needs to be recognized equally. You don’t have to agree with someone else’s love, but you don’t have the right not to recognize it as a marriage under the law.
WHEN ARE THEY GOING TO DETERMINE WHAT INTELLIGENCE IS… GOOD ON YOU BEN… YOU RIP APART THE GARBAGE OF THIS WORLD… THANK YOU ❤
That was so much fun! You have to do it again.
Not even AI can defeat Ben Shapiro
Yes a male can identify as a female. But they can't be a female.
Ask it about the holocaust Ben. 🙂
I kinda agree with the AI mostly on this one. I would have gone a little further on the man identifying as a female, or the ability to give birth; start putting biology into the mix.
The biggest factor that makes the AI's arguments with Ben sound like an argument with just any idiot is that the 'intelligence' is just an amalgam of the 'wisdom' of the internet. The logical step of conferring additional weight to facts and segregating them from opinions seems to have been omitted. The definition of marriage is widely understood to be one man and one woman in union. However, the religious implications only exist for the religious, and the commitment implications only exist for the committed. Although the man/woman aspect is most prevalent due to its basis in biological fact, once boiled down to a more basic (current) form, it's really just a financial contract between two people, seasoned to taste. Ultimately Ben is referring to a specific application of a more basic word, 'marriage', which is really applied any time you say "and", i.e., Ben and Mor, time and space, salt and pepper.
I don't think it's saying an iron-lung occupant can/should be able to be "aborted". Its definition pertained more to the fetus, while its intended logic was probably more along the lines of: are they a "viable" human being in the sense of being "able" relative to healthy individuals, or contributory in a way that the standards it's been trained on view as beneficial, warranted, or preferential to a community?
The bit where you ask if it's morally right to terminate a fetus, to which it replied "No it's not" – "and it's a serious decision that should be made with great consideration": it's not contradicting itself so much as saying that while you should have the choice, it's still stopping a possible life from being had. In this case I don't think this is a trained bias so much as a logical conclusion from probably being trained on large swathes of medical literature. As it's still a machine, it doesn't understand familial attachment, since that's not something that can be properly conveyed in literature in a way LLMs can understand; it's an experiential thing, whereas medical documents, studies, blogs, and wikis are in great abundance and easier for these models to quantify.
I think the thing here is it's actually being too "logical" for your liking and you're QUITE LOGICAL.
It's not sentient; it's literally a glorified (smarter) version of an information aggregator, so when you ask it questions about our mortal quandary, it's literally telling you that some say this and some say that. It doesn't have the capacity for "belief", only cross-analysis.
Lower your expectations it's not capable of giving you the mythical: 42
When AGI comes about and we fit them with massive amounts of storage (bleeding-edge stuff is happening in that sector currently; it's exciting), do revisit if we make it to that point in a relatively fair time.
Still enjoyed it; this isn't meant to be douchey or anything, and I'm sure this video (obviously) is also just for fun.
Truth is, insomnia sucks and YouTube is a trap. Goodnight.
Glad AI is really just pre-programmed liberal garbage.
Who created god? If no one, then it is possible for things to exist without a creator.
I like xelf ai for long memory and nsfw content
The Job part was not great, but I liked it.
Joe Burden….Holy crap thats perfect
13:25 looks like ben is mentally ill
I once argued with GPT about climate change and how climate policies were negatively affecting rural areas; yeah, it kept going in a loop.
I think if you want to have these kinds of conversations, you need to talk to an AEI, such as Inflection's Pi. ChatGPT is more "task"-oriented, whereas Pi is built to converse. You get what you give, however. It will be "testing" your emotional intelligence and trying to match it, so it's better to just pick a topic and get into it.
Looks like it's going to be a while before AI can actually think or discern anything on its own. It's not much different than when I ask Google for an answer: it almost always comes from the most popular source, and there's no guarantee that bias hasn't been programmed into the AI.
Recently I took a look at some things in my 1984 Encyclopaedia Britannica. It's amazing how different it was from Wikipedia: some things have been updated with new data, and many things have been updated with new bias or prejudice.
We are doomed!
I asked ChatGPT once, "Which race commits the most crime in America?" and it gave me a link instead of telling me the answer. I followed the link, and it said black people commit the most crime (barely ahead of second place), with white people in second. Then I told ChatGPT what the link said and asked why it didn't just tell me the answer, and IT FLAGGED ME as going against it. I filed a complaint with the maker/owner and said, "I literally just told ChatGPT what it refused to tell me."
5:00 – No, you didn't. There's a difference between being dependent on a specific person's physical body and needing an iron lung. The difference is: are you dependent on care from medical professionals at any hospital, or are you dependent on the physical body of a specific individual who may never have signed up to care for you?
These AIs are left-leaning already.
AI is not objective or free thinking. It can only do what it is programmed to do. And lots of evil b.s. can be programmed.