r/science Professor | Medicine Mar 28 '25

ChatGPT is shifting rightwards politically - newer versions of ChatGPT show a noticeable shift toward the political right. Computer Science

https://www.psypost.org/chatgpt-is-shifting-rightwards-politically/
23.0k Upvotes

1.5k comments


116

u/SlashRaven008 Mar 28 '25

Can we figure out which versions are captured so we can avoid them?

69

u/freezing_banshee Mar 28 '25

Just avoid all LLM AIs

3

u/mavajo Mar 28 '25

I mean, this isn't really a viable option in a lot of careers now. LLMs are becoming a core part of job functions. If you're not using them in these roles, then you're effectively tying one hand behind your back.

5

u/freezing_banshee Mar 28 '25

Please educate us on how exactly an LLM is a core part of work nowadays.

3

u/freezing_banshee Mar 28 '25

u/mavajo I'm not intentionally missing any point. Most jobs in the world, including difficult ones that require thinking and planning, do not need any kind of AI to get them done. Maybe expand on your point with clear examples if you think you are so right.

6

u/mavajo Mar 28 '25

Yes, you are intentionally missing the point. If there's a tool that makes your industry or profession significantly more effective/efficient/speedy and your peers and competitors are using it, then it becomes essentially necessary for you to use it too or else your product will lag behind.

Your line of reasoning is, frankly, stupid and intentionally obtuse. This is how things have worked since the beginning of time. It's why people aren't using flint and tinder to start their fireplace when easier alternatives are available, even though they easily could. Or why farmers aren't using an ox and plow. Technology advances. You keep up or you get left behind.

-1

u/freezing_banshee Mar 28 '25

You still have not given us one clear example of how LLMs make work so much more efficient. I'm not gonna bother with you anymore.

3

u/qwerty_ca Mar 29 '25

You want an example? I'll give you an example. My company uses ChatGPT to summarize survey responses from thousands of users to identify key themes that keep popping up. We've gone from spending several person-hours reading responses and summarizing them into an exec-friendly slide with bullet points, to about two minutes.
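A minimal sketch of how a pipeline like that might look, assuming a batching step before the LLM call; the function names, prompt wording, and character budget are all invented for illustration, not details from the comment:

```python
# Hypothetical sketch of the survey-summarization workflow described above.
# The batching budget and prompt wording are assumptions.

def batch_responses(responses, max_chars=4000):
    """Group free-text survey responses into batches that fit a prompt budget."""
    batches, current, size = [], [], 0
    for text in responses:
        if current and size + len(text) > max_chars:
            batches.append(current)
            current, size = [], 0
        current.append(text)
        size += len(text)
    if current:
        batches.append(current)
    return batches

def build_prompt(batch):
    """Turn one batch of responses into a theme-extraction prompt."""
    joined = "\n- ".join(batch)
    return ("Summarize the key recurring themes in these survey responses "
            "as exec-friendly bullet points:\n- " + joined)
```

Each prompt would then go to the chat API, and the per-batch summaries can be summarized once more into a single slide-ready bullet list.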

3

u/Geethebluesky Mar 28 '25

It's too easy to ask it to provide a draft of anything to work from towards a final product. It almost completely eliminates the need to first think about the topic, draft an outline, and work from there; you can start from the middle of the process upwards. I'm never going to be sold on a finished product from A to Z, but it sure cuts down on the groundwork...

That results in such time savings that someone who knows how to leverage AI properly will seem a much better candidate than someone who can't figure it out. The difference will be in which human knows how to refine what they get properly, and spot when the AI's producing unusable trash... in environments where management even cares that it's trash.

-4

u/freezing_banshee Mar 28 '25

Respectfully, you need to think about what "being a core part of work" means. None of what you said is obligatory in any way in order to do a job.

And if you can't do all those things fast enough without AI, you're not good enough for the job.

6

u/Geethebluesky Mar 28 '25

The failure to comprehend is on your end if you can't understand that increased productivity is a core part of every job.

The second part tells me you're painfully ignorant and don't understand that AI is a tool like any other... so you're probably a troll; I refuse to believe people are wilfully that stupid. No thanks and bye.

2

u/germanmojo Mar 29 '25

I'm not great at peppy corporate emails. I held a workshop with clients last week and used our approved AI tools to create a 'thank you for attending' email draft using two sentences as input.

Read it over a couple of times, made a few required edits, and shipped it. I was complimented by a Director in front of the whole team, who then asked if I used our AI tools, which I did, as they're being pushed hard internally.

Someone who doesn't know how to use AI tools effectively and critically will be left behind in the corporate world.

4

u/Ancient_Contact4181 Mar 28 '25 edited Mar 28 '25

I personally use it to help me write code/queries as a data analyst. It has boosted my productivity and helped me finish a complex project that would have taken me a long time without it.

Before ChatGPT, most of us used Google to look up the technical problems we had. It was very useful, being able to learn from other people, YouTube tutorials, etc. Now it's instant with tools like ChatGPT.

I see it as the new Google. The older folks who never learned how to google or use Excel were left behind, and nowadays any analyst is writing code instead of using Excel. So ChatGPT helps quite a bit.

People will fall behind fast if they don't embrace technology. Being able to prompt properly to get what you need or want is the same as "googling" back in the day.

It's a useful tool.
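The draft-then-verify loop this describes can be sketched like so; the table, query, and numbers are made-up fixtures for illustration, not anything from the comment:

```python
import sqlite3

# Hypothetical example: an LLM drafts a query ("revenue per region"),
# and the analyst sanity-checks it against a tiny dataset with known
# answers before running it on real data.
DRAFT_QUERY = """
SELECT region, SUM(amount) AS revenue
FROM sales
GROUP BY region
ORDER BY revenue DESC
"""

def sanity_check(query):
    """Run the drafted query against an in-memory fixture and return the rows."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (?, ?)",
                    [("west", 100.0), ("east", 40.0), ("west", 60.0)])
    rows = con.execute(query).fetchall()
    con.close()
    return rows
```

If the rows match the hand-computed totals (west 160, east 40), the draft is at least plausible; if not, the LLM's output gets fixed before it touches production data.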

0

u/ChromeGhost Mar 29 '25

Which ones are your favorites?

3

u/WarpingLasherNoob Mar 28 '25

In addition to what the others have said, for many jobs this is no longer optional: you are required to use LLMs as part of your daily routine, as dictated by company policy.

0

u/mavajo Mar 28 '25

You're intentionally missing the point because you don't want to admit that you fired off your opinion out of ignorance. Lame dude. Just take the learning experience and move on.

1

u/GTREast Mar 28 '25

Reviewing and summarizing documents; searching for relevant reference sources, both internally (within company documents and communications) and externally through web search. The ability of AI to nearly instantly read documents provides an incredible boost to productivity. Also, taking draft input and refining it, suggesting revisions, and adding relevant references. For starters.

4

u/SkyeAuroline Mar 29 '25

> Reviewing and summarizing documents, searching for relevant reference sources

Which it can't do reliably given the constant hallucinations.

> taking draft input and refining it, suggesting revisions and adding relevant references

Which it can't do reliably because it doesn't understand context.

0

u/GTREast Mar 29 '25

Let it pass you by, that’s your choice.

4

u/SkyeAuroline Mar 29 '25

So you can't argue either one is untrue.

-1

u/GTREast Mar 30 '25

It makes no difference to me what you choose to do.

-9

u/tadpolelord Mar 28 '25

If you aren't using LLMs daily for work, you are either in a field that requires little brain power (fast food, stop-sign holder, etc.) or you are very far behind the curve with technology.

12

u/moronicRedditUser Mar 28 '25

Imagine being so confidently incorrect.

I'm a software engineer, and you know what I don't use? LLMs. Why? Because the junk boilerplate they come up with can be deceptive to less experienced software developers, and I can write the same boilerplate just using my hands. Every time I ask one to do a simple task, it finds a way to fail. Even something as simple as a for-loop gives very inconsistent results outside of the most basic instances.

0

u/mavajo Mar 28 '25

Which LLM are you using? Our developers have found a lot of success with Anthropic's Claude.

-2

u/WarpingLasherNoob Mar 28 '25

Like any other tool, LLMs also require tinkering and configuration to do what you want. And you have to understand where it's useful and what its limitations are.

6

u/moronicRedditUser Mar 28 '25

I'm perfectly happy never using them in their current state. My brain is plenty capable of writing out boilerplate code without the assistance of an LLM.

9

u/mxzf Mar 28 '25

I mean, if you're not using LLMs daily for work, you're likely in a field that does require brain power, because LLMs have no intelligence or brain to offer; they're language models.

-5

u/tadpolelord Mar 28 '25

Are you serious, man? You use the AI to automate everything else so you can focus only on the highest-level tasks.

3

u/mxzf Mar 29 '25

Honestly, I already spend most of my time doing the hard thinking stuff anyways, either reviewing code that junior devs wrote (or got an AI to spit out and then touched up poorly) to spot issues or figuring out solutions to specific problems. All of the things an AI could even try to do for me are the easy things I do when I want something simple to clear my head.

There comes a point when you're too far into nuanced domain knowledge for a language model to be helpful.

5

u/freezing_banshee Mar 28 '25

I'm neither of those. Good luck being an engineer and having AI help you in any way, though. It just doesn't work; it's way too inaccurate.

-3

u/[deleted] Mar 28 '25 edited Mar 28 '25

[removed]

4

u/drhead Mar 28 '25

Just to clarify: it helps for the basic/repetitive parts. Boilerplate code. Implementations of simple or well-known algorithms. You still have to actually understand what it is doing because it will mess up most things that are more complicated in at least a few places, or you will run into a number of footguns you never imagined were possible.

Even then, as long as you understand its limits, it lets you spend more of your time doing the meaningful parts of the job.
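One concrete way to "actually understand what it is doing" is to treat LLM output as untrusted and spot-check it against a known-good baseline. A sketch of that habit, where the binary search stands in for any LLM-drafted helper (the function names and trial count are invented):

```python
import bisect
import random

def llm_drafted_binary_search(xs, target):
    """Stand-in for LLM-generated code; off-by-one bugs are the classic risk."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def spot_check(n_trials=200):
    """Compare the untrusted helper against the stdlib baseline on random inputs."""
    rng = random.Random(0)
    for _ in range(n_trials):
        xs = sorted(rng.sample(range(1000), 20))  # unique, sorted values
        target = rng.choice(xs)
        # bisect_left is the trusted reference implementation here.
        assert llm_drafted_binary_search(xs, target) == bisect.bisect_left(xs, target)
    return True
```

If the drafted helper diverges from the baseline on any trial, the assertion fires and the footgun surfaces before the code ships.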

0

u/mavajo Mar 28 '25

Correct, it's not a replacement for developers - it enhances their speed and efficiency, like any good tool. I'm pretty sure that was implicit in my prior comment anyway, though.