No matter what you ask it, it'll brag about what a great job it's doing answering you, announce that it's having a baby, then tell everybody that it's being let go because there are better AIs now. It'll thank a few key people it worked with, and tell you that it's actually thrilled with this opportunity to take a break from answering your question, and will spend more time on its old hobby of being an online resume.
For me the bar for "reputationally safe" is really high because of my market (cynical tech CTOs, etc., who don't respond well to things that sound like ChatGPT), so I don't expect to clear it any time soon. For many others, though, that bar is pretty low, as long as the output is good enough for LinkedIn's algorithm to give it impressions.
A Nash equilibrium of automated bullshit: it'll just make everything more miserable, programmatically.
I have seen a number of CVs over the past few months that fall into two eye-rolling categories. First, those that have the same set of skills in the exact same order, and routinely sport identical expressions. Over time I've come to associate them with low-grade content farms. Second, a smaller set of exceptionally polished ones that feel unique and really want me to interview the candidate. These candidates will then utterly bomb in the interview, to the point where I'm often asking myself whose CV it was they had submitted.
Signal-to-noise ratio is tending towards zero.
The converse will be true, but the price of AI will just make poor people suffer even more.
Just the long march of wealth inequality and its time-sucking capitalism.
A recent issue in the job application realm is AI application bots that will apply to hundreds of jobs on your behalf, which is the opposite problem. Seems like both sides are racing to make applications as useless as possible as quickly as possible.
If you don't have a network, good luck in the future.
God, I hope the poor thing never achieves consciousness. It will be like the butter-passing robot from Rick & Morty.
It would be content so unhinged, it would remove the need for management consulting as an industry - companies could simply type their problems in chat and do the exact opposite of what Evil LinkedIn Neuro suggests!
(for the uninitiated: https://www.youtube.com/@Neurosama & https://en.wikipedia.org/wiki/Neuro-sama)
I learned this while considering watching a video from MIT. Accordingly, I’m adding “AI Training Coordinator — MIT Inspired” to my skills.
On LinkedIn your next resume will have the impact of Mount Rushmore, the seated Lincoln, or a Soviet-era workers' monument. (The Thinker will be censored out because of nudity.) :)
During this brilliant interaction I managed to learn a lot about improving my leadership capabilities, and teamwork.
If you are also looking to humbly increase your leadership potential, seek out #CoastalCoder #Mindfulness #Leadership or contact us for your #marketing needs
If it was just a bunch of linked profiles with a job matching function, it would still be LinkedIn.
But of course, you can't work at a place that does something that mundane without suggesting something that makes you look like Facebook or Twitter. You have to at least give people some sort of reason to see what their old colleagues are up to.
Nobody really wants to read the LinkedIn feed, so it's perfectly acceptable that it gets flooded with AI generated content. In effect, the content on LinkedIn is that picture of a happy family on your insurance brochure. You can't not have a photo of something on that kind of marketing document, and you can't be a social network without some sort of doom-scrollable content.
This is just a cheap way to generate some wallpaper.
Zooming out, I bet a lot of the economy is like this. In LI’s case literally some of the smartest people, people with PhDs who were maybe even born in another country, thinking for 40 hours a week about how to rank one piece of meaningless drivel above another one. This is instead of solving real, tangible problems that everyone can see. Ok maybe those people will pay taxes and end up contributing to something like education because they have to, but it’s a pretty inefficient way of making the world better.
I suspect there are a lot of these sorts of investments in the big players, a bunch of teams doing far-from-core things that someone thought was worthwhile.
A 2021 study empirically tested several of Graeber's claims, such as that bullshit jobs were increasing over time and that they accounted for much of the workforce. Using data from the EU-conducted European Working Conditions Survey, the study found that a low and declining proportion of employees considered their jobs to be "rarely" or "never" useful.
> Many jobs that appear bad are actually needed, often only because of regulatory requirements or because their importance is misunderstood.

These people are self-reporting this. Quite frankly, with the number of people in bullshit jobs who think they're doing real work, I wouldn't put a lot of value in those types of self-reports.
I’ve recently remedied this to a degree by lowering salary expectations and looking in fields with a more scientific and practical basis in the products and outputs. Unfortunately I’m not a scientist, only a programmer, so my utility is seriously limited and finding work is quite a bit harder than if I were to stick within the SV startup scene.
We keep saying we need UBI but at the same time "we don't have enough homes". Then instead of UBI, maybe people should "make homes"? (That's just one example - there are also jobs in food, healthcare, mental illness care, spacecraft, etc.)
I mean, not nobody. I follow a lot of people that post very thoughtful things that spark discussion, and it's one of the only places I know of other than here where I can discuss topics related to my career or field with peers, and for me that's useful.
That or LinkedIn should at least be compelled to ask explicit permission for model training. None of this Darth Vader stuff where they "altered the deal".
Little point. It'll be like Facebook's opt-out and only cover things you post/update going forward. Everything you've already posted has already been slurped into the training set; it won't be taken out, nor the model(s) retrained without it.
The only way to show disapproval of this sort of behaviour that they'll feel is to stop using services that auto-opt-in for anything, and not enough people are likely to do that for it to be effective.
From the commercial/influencer side, many have taken the AI route already, using LLMs to help write or spice up their posts. Even for paid users, the site has offered to help write your bio or certain types of posts for the past few quarters.
The posts of yesteryear, and the comments sections, probably look like a "valuable" source to them. It would be a bit scarier if this covers video and photos too, although besides the headshots there has already been a lot of AI content in the tech space lately anyway.
It’s a low enough bar that I think AI content will fit right in.
I’m curious if LI has scraped data before giving people the opportunity to disable the feature.
Tell stories about how “a man walked in and saved the universe”, and end every sentence with “Agree?”
I don't mean fake users (although I wouldn't put it past corporate greed to try faking users). It could be sold as a helpful feature, like summaries of workplace happenings, news, world events, or discussions on the platform in the feeds. Of course, they would need to be filtered for ethical alignment with the social media company, as well as community safety, naturally... Certain political opinions may be less safe than others, and so on...
It is already happening with engagement farming users, so a platform doing it to make itself look more active is not a stretch at all. Reddit did that sort of astroturfing the old fashioned way back when it was starting up, so there is at least one well documented precedent already.
Despite the UK still having the Data Protection Act.
> LinkedIn seems to have auto enrolled folks in the US, but hearing from folks in the EU that they are not seeing this listed in their settings (likely due to privacy regulations).
Honestly, GDPR looks like a godsend! It came just at the right time!
An actual article: https://techcrunch.com/2024/09/18/linkedin-scraped-user-data...