
How Google Got Smoked by OpenAI's ChatGPT [even though its LaMDA chatbot is better]

Hamartia Antidote


The most embarrassing part is that the search giant has a chatbot that’s better.


Google’s had an awkward week. After years of preaching that conversational search was its future, it’s stood by as the world discovered ChatGPT.

The powerful chatbot from OpenAI takes queries—some meant for the search bar—and answers with astonishing conversational replies. It’s shared recipes, reviewed code, and argued politics so adeptly that screenshots of its answers now fill social media. This was the future Google promised. But not with someone else fulfilling it.

How Google missed this moment is not a simple matter of a blind spot. It’s a case of an incumbent being so careful about its business, reputation, and customer relationships that it refused to release similar, more powerful tech. And it’s far from the end of the story.

“Google thinks a lot about how something can damage its reputation,” said Gaurav Nemade, a former Google product manager who was the first to helm its LaMDA chatbot project. “They lean on the side of conservatism.”

Google’s LaMDA—made famous, if you can call it that, when engineer and tester Blake Lemoine called it sentient—is a more capable bot than ChatGPT, yet the company’s been hesitant to make it public. For Google, the problem with chatbots is they’re wrong a lot, yet present their answers with undeserved confidence. Leading people astray—with assuredness—is less than ideal for a company built on helping you find the right answers. So LaMDA remains in research mode.

Even if chatbots were to fix their accuracy issues, Google would still have a business-model problem to contend with. The company makes money when people click ads next to search results, and it’s awkward to fit ads into conversational replies. Imagine receiving a response and then immediately getting pitched to go somewhere else—it feels slimy and unhelpful. Google thus has little incentive to move us beyond traditional search, at least not in a paradigm-shifting way, until it figures out how to make the money aspect work. In the meantime, it’ll stick with the less impressive Google Assistant.

“There’s a reason why Clayton Christensen wrote The Innovator’s Dilemma. It’s a real dilemma,” Box CEO Aaron Levie told me on the Big Technology Podcast this week. “Google doesn’t inherently want you … to just get the answer to every problem. Because that might reduce the need to go click around the web, which would then reduce the need for us to go to Google.”

But Google’s reasons for keeping LaMDA private fade a bit now that a competitor has emerged. A sophisticated, public chatbot like ChatGPT makes waiting for the perfect business model risky. Delay long enough, and you could cede the market. ChatGPT will also take criticism as it gains adoption, sustaining hits that otherwise would’ve been Google’s. And ChatGPT’s shortcomings will teach people to view its certainty with skepticism, clearing the way for a risk-averse Google to release its own version.

For now, ChatGPT’s threat to Google remains partially hypothetical. The bot doesn’t access the internet, knows nothing beyond 2021 (or at least, so it says), and has no ads. So while it may take some traditional queries away from Google, it won’t push the $1.2 trillion company to the brink. At least as presently constituted.

But things could change in a hurry. Should OpenAI connect ChatGPT to the internet, it could push Google to bring its own product to market, and its vision for the future along with it. And once Google gets involved, those who’ve seen its chatbot technology expect it to win.

“If ChatGPT or some other product ever became a real threat,” said Lemoine, “they’d just bite the bullet and release LaMDA, which would smoke ChatGPT.”




PDF ChatGPT thread
 
I just had a two-minute dialogue with the OpenAI chat. It seems very short on information and focused almost entirely on grammatical language delivery. Every second answer is a canned "I am a large language model trained in...". It does not have access to internet information, and answers are entirely limited to training datasets. At best it can be used at the rendition layer.
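As a side note for anyone who wants to reproduce that "limited to training datasets" behaviour programmatically: ChatGPT itself was only a web interface at this point, but OpenAI exposed similar models through its completions API. A minimal sketch, assuming the legacy `openai` Python package and the `text-davinci-003` model (assumptions for illustration, not something specified in this thread):

```python
# Rough sketch: probe the model's knowledge cutoff via OpenAI's
# completions endpoint. Assumes the legacy (pre-1.0) `openai` Python
# package and an API key in the OPENAI_API_KEY environment variable.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-003",  # assumed model; ChatGPT itself had no API yet
    prompt="What is the most recent event you know about, and when did it happen?",
    max_tokens=150,
    temperature=0,             # keep the output stable across runs
)

# The model answers only from its training data; it cannot browse the web,
# which is why its replies top out around its training cutoff.
print(response.choices[0].text.strip())
```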
 
Because you're asking it the wrong questions and giving it the wrong missions.

It's pretty obvious AIs aren't connected to the internet and are entirely limited to their training datasets; if you connect them to the internet, you'll get a ton of wrong information in the output.
 
Nothing to disagree with there. My point was that a conversational entity is a product of conversational linguistics, context, analysis, and synthesis. Of these, the OpenAI chat I just had did the first part well, the second part poorly, and the third and fourth parts in a very canned manner. But maybe what they set out to do was different from my definition.
 
I just had a two-minute dialogue with the OpenAI chat. It seems very short on information and focused almost entirely on grammatical language delivery. Every second answer is a canned "I am a large language model trained in...". It does not have access to internet information, and answers are entirely limited to training datasets. At best it can be used at the rendition layer.
The creators had a deep discussion on that and decided that this is the best answer to "send bobs and vegana" queries.

@SIPRA
 
I just had a two-minute dialogue with the OpenAI chat. It seems very short on information and focused almost entirely on grammatical language delivery. Every second answer is a canned "I am a large language model trained in...". It does not have access to internet information, and answers are entirely limited to training datasets. At best it can be used at the rendition layer.

I think if you know its limitations and understand how it works, it can be pretty cool. It can be an interactive Google when it comes to information, or a very good storyteller if you ask it to generate something.

I have a thread on it

Here I have a conversation about "suburbanization" and I gave it an A-

Here I tell it to generate a story about a Zombie apocalypse attack during a Wimbledon tennis match. I'd give it an A...especially after I tell it to "add more gore"!
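For context on how those follow-up instructions work under the hood: the model keeps no memory between requests, so a "now change X" follow-up only works because the conversation so far gets fed back in with each new turn. A rough sketch of that pattern, again assuming the legacy `openai` package and `text-davinci-003`; the `ask` helper and the example prompts are hypothetical:

```python
# Sketch of iterative refinement: keep a transcript and resend it with each
# follow-up so the model can revise its earlier output. Hypothetical helper
# using the legacy `openai` package and an assumed text-davinci-003 model.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
transcript = []  # list of (speaker, text) turns


def ask(instruction: str) -> str:
    transcript.append(("User", instruction))
    # Flatten the whole conversation into a single prompt each time.
    prompt = "\n".join(f"{speaker}: {text}" for speaker, text in transcript)
    prompt += "\nAssistant:"
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=400,
        temperature=0.7,
    )
    answer = response.choices[0].text.strip()
    transcript.append(("Assistant", answer))
    return answer


story = ask("Write a short story about a zombie outbreak at a tennis match.")
revised = ask("Rewrite it, but make the ending more hopeful.")  # follow-up builds on the first turn
print(revised)
```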
 
@Hamartia Antidote bro can you actually do that and post the results here? just to help out our Indian friend
that'll be so dope lmao

Hey I already got two warnings from ChatGPT over gore and death decisions so I'm not going for #3. Sticking with the "G" rated stuff.

[Image: MPA green-band "approved for all audiences" title card from the Toy Story 4 trailer]
 
Describe in 8 paragraphs how modern amenities changed in US homes from the 1950s, 1960s, 1970s, 1980s, 1990s, 2000s, 2010s, and 2020s.

In the 1950s, modern amenities in US homes included electric appliances such as refrigerators, ovens, and washing machines. Many homes also had television sets, which were becoming increasingly popular. In the 1960s, air conditioning became more common in homes, particularly in warmer regions of the country. Additionally, the use of dishwashers and clothes dryers became more widespread.
In the 1970s, the use of microwave ovens became more popular in US homes, allowing for faster cooking and reheating of food. Energy-efficient appliances also began to become more common during this decade.
In the 1980s, personal computers became more affordable and began to be used in many homes for personal and professional purposes. Additionally, home security systems, such as alarms and surveillance cameras, became more common.
In the 1990s, the internet became widely available in many homes, providing access to a vast amount of information and allowing for online communication. Home entertainment systems, including video game consoles and larger televisions, also became more popular.
In the 2000s, the use of smart home technologies, such as programmable thermostats and remote home monitoring systems, became more common. These technologies allowed homeowners to more easily control and monitor their homes while away.
In the 2010s, the popularity of smart speakers, such as Amazon's Alexa and Google Home, increased as they became more affordable and user-friendly. These devices allowed for hands-free control of various smart home technologies.
In the 2020s, the use of renewable energy sources, such as solar panels, has become more common in US homes. Additionally, the use of electric vehicles has increased, leading to the development of home charging stations.
Overall, the modern amenities in US homes have evolved to include a wider range of appliances and technologies that provide convenience, energy efficiency, and security.
 
Describe in 5 paragraphs when did natural gas lighting become popular in the US and when was it replaced with electrical lighting?


Natural gas lighting became popular in the United States in the early 19th century, as a cheaper and more efficient alternative to traditional forms of lighting such as candles and oil lamps. The first natural gas well was drilled in Fredonia, New York in 1821, and by the mid-1800s, natural gas was being used for lighting in many American cities.

The widespread adoption of natural gas lighting was due in part to the efforts of entrepreneurs and inventors who developed new technologies for extracting and distributing natural gas. For example, in 1825, the American inventor Samuel Morey developed a method for refining natural gas, which made it easier to transport and use. In the 1850s, the American inventor Philip H. Grosser invented a gas meter that could accurately measure the amount of natural gas being used, enabling customers to pay for their gas usage on a per-unit basis.

Natural gas lighting was popular in both residential and commercial settings. In homes, natural gas lamps were used for illumination, while in businesses and public buildings, natural gas was used to light chandeliers and other large fixtures. Natural gas lighting was also used to light streets and other public areas, making it an important part of urban infrastructure.

However, by the late 1800s, natural gas lighting was beginning to be replaced by electrical lighting. This was due to the development of new technologies that made it possible to generate and distribute electricity on a large scale. In 1879, the American inventor Thomas Edison developed the first practical incandescent light bulb, which used electricity to generate light. This was followed by the development of the first electric power plants, which made it possible to supply electricity to large numbers of customers.

As a result, natural gas lighting gradually fell out of favor and was replaced by electrical lighting. By the early 1900s, most American cities had switched to electrical lighting, and natural gas lighting was largely relegated to rural areas and other places where electricity was not yet available. Today, natural gas is still used for lighting in some parts of the world, but it has been largely replaced by electrical lighting.
 
Describe in 6 paragraphs how heating US homes has evolved in the last 200 years in terms of wood, coal, natural gas, oil, and electricity. What are most homes in the US heated with today?

In the early 19th century, most homes in the United States were heated with wood. This was the most readily available and affordable source of fuel, and people would cut down trees and use the wood to heat their homes. As the country became more industrialized, however, wood became less practical as a heating source.

In the mid-19th century, coal began to be used more widely as a heating source. It was more efficient than wood, and the development of the railroad made it easier to transport. However, coal was still a dirty and labor-intensive fuel, and it was not without its drawbacks.

In the early 20th century, natural gas became a popular heating option. It was cleaner and more efficient than coal, and it could be easily transported through pipelines. Natural gas remains a popular heating source today, especially in urban areas.

In the latter half of the 20th century, oil became a popular choice for heating homes. It was also cleaner and more efficient than coal, and it could be easily transported by tanker truck. However, the rising cost of oil made it less attractive as a heating option, and many people switched back to natural gas or electricity.

Electricity has also become a popular heating option in recent years. Electric heat pumps are efficient and easy to install, and they can be used to heat and cool homes. However, the high cost of electricity can make it an expensive option, especially in areas with high energy prices.

Today, most homes in the United States are heated with natural gas or electricity. Some homes are still heated with oil, and a small percentage are heated with wood or coal. The type of heating source used often depends on the availability and cost of the fuel in a particular region.
 
AI will never come close to a human brain, because AI can't have free will.

It can't even create truly random numbers with a computer, much less create something like a human brain.

Determinism versus free will is the key difference between humans and everything else.

It's also the key to how we humans perceive time; time is a constant human illusion.
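On the random-numbers point specifically: standard pseudo-random number generators are indeed deterministic, since the same seed reproduces the same sequence, while operating systems draw on hardware and environmental entropy for unpredictable values. A minimal Python sketch of that difference (the seed 42 and the three-sample size are arbitrary choices for the demo):

```python
import random
import secrets

# A seeded pseudo-random generator is deterministic:
# the same seed always reproduces the same sequence.
random.seed(42)
first_run = [random.random() for _ in range(3)]

random.seed(42)
second_run = [random.random() for _ in range(3)]

print(first_run == second_run)  # True: identical sequences

# By contrast, the secrets module draws on the OS entropy pool
# (hardware/environmental noise), so these values are not reproducible.
print(secrets.token_hex(8))
print(secrets.token_hex(8))
```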
 
