There's a lot of debate about the threat LLMs pose to human jobs. I'm extremely skeptical of that threat and find it much more likely that LLMs will result in more jobs.
That doesn't mean nothing is going to change, though. LLMs are incredibly useful, and many people are using them over other tools. The problem for those tools is that their original purpose was to automate human tasks, but they still required a significant amount of human input to do so. LLMs can likely automate those same tasks. From what I've seen, LLMs do produce lower-quality output for those tasks. However, they require significantly less human input, making the ROI on LLM use much higher.
Let's look at some examples. The first is UI/UX design tools such as Figma. Figma itself will likely be fine because they are trying to figure out how to use AI, but a design tool without AI is already obsolete. The primary reason is the traditional design process, where you start with wire-frames, then mocks, then design prototypes, and then hand the design off to engineering. This process only really worked when everyone involved had spent a good part of their career building software: product managers, engineers, UX designers, QA, founders, etc.
The process breaks down once you involve people outside that group... which is required for the vast majority of B2B software. Being able to look at a wire-frame or a design prototype and imagine how the finished application would work is a skill. Most of us take that skill for granted, but it is a hard-earned skill nonetheless. Without it, people are not really capable of providing constructive feedback. They'll gravitate towards the extremes, from "everything looks good!" to "why don't any of these buttons actually work?" Without this skill, people can only provide good feedback when working software is put in front of them. The UI/UX of B2B software tends to be more frustrating than that of B2C software not because B2B software is more complicated, but because it often has to be built on more assumptions due to the lack of good feedback.
LLMs will change this dynamic. Mocks are primarily used because UI/UX tools make it a lot cheaper to build them than actual working software. Being able to tell an LLM to build a prototype and iterate on that prototype with natural conversation makes getting working software cheaper than building mocks. We now have the ability to get better feedback from users/customers/stakeholders while spending less time building something they can see and use. This process does not replace UX designers. Far from it. But it does change what tools are available to them. If they have the option to create a working prototype in the same time it would normally take to generate a wire-frame, why wouldn't they?
The second tool that's threatened is project management software, e.g. JIRA, Trello, Asana, etc. The nearly universal problem with project management is the set of balancing acts it demands. A process should provide accurate tracking and reporting, but not be so burdensome on the staff that the process itself significantly affects the timeline. The task tracking should be detailed enough to be useful, but not include so much detail that it becomes difficult to see what is happening through the noise. Those details also need to be maintained so that the tasks don't go stale when plans change.
All of that takes a lot of effort to put in place. It is often bespoke to the type of project being done (e.g. R&D projects have to be run differently than production projects) as well as the team you have on hand (e.g. some developers swear by pair programming and some get incredibly stressed out by it). Even if you do have a good process nailed down, it likely eats up at least a couple of hours a week for every individual contributor. Then there's management overhead on top of that. I once worked at a company that had weekly meetings on *how* to improve our JIRA usage. We didn't plan on these meetings being perpetual, but we had so many of them that we started scheduling meetings about how to make those meetings more productive.
The process changes significantly with LLMs. We use AI note takers at our firm in all our team meetings. The note taker is pretty good at capturing the action points we mentioned and who they were assigned to. If a client calls my business partner, he can ask the note taker for a status update drawn from all the meetings in the past month. Is the note taker perfect? Far from it. It often gets the client names wrong. Some of the action points are nonsensical. Yet it is accurate enough to do 90-95% of what JIRA would do. When you consider that no one is spending as much time maintaining tickets, that ROI is hard to turn down.
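To make that workflow concrete, here's a minimal sketch of how a note taker like this could be wired together: build a prompt asking the model for structured output, then parse the reply into action items with owners. This is an illustration, not our firm's actual tool; the `call_llm` function is a placeholder for whatever model API you'd use, and the transcript and canned reply are invented examples.

```python
import json

def build_prompt(transcript: str) -> str:
    # Ask the model for JSON so the reply is machine-parseable.
    return (
        "Extract every action item from this meeting transcript. "
        "Reply with a JSON list of objects with 'task' and 'owner' keys.\n\n"
        + transcript
    )

def call_llm(prompt: str) -> str:
    # Placeholder: swap in a real model API call here.
    # Returns a canned reply so the sketch runs end to end.
    return json.dumps([
        {"task": "Send the revised proposal to the client", "owner": "Dana"},
        {"task": "File the login-timeout bug", "owner": "Marcus"},
    ])

def extract_action_items(transcript: str) -> list[dict]:
    reply = call_llm(build_prompt(transcript))
    items = json.loads(reply)
    # Defensive filter: models sometimes return malformed or partial entries,
    # which matches the "some action points are nonsensical" failure mode.
    return [i for i in items if "task" in i and "owner" in i]

transcript = "Dana: I'll send the revised proposal. Marcus: I'll file that bug."
for item in extract_action_items(transcript):
    print(f"{item['owner']}: {item['task']}")
```

In practice you'd also handle `json.JSONDecodeError` and possibly re-prompt the model, since the structured reply is the weakest link in this pipeline.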
NOTE: I'm not arguing that ticketing systems are obsolete. The issue is primarily the amount of human labor traditional ticketing systems are designed around.
I think back on some of the companies I worked for in the past and realize how much time we could have saved if we had LLMs to act as our project management tools. Instead of QA being uncertain about some unclear requirements, they can now ask an LLM what decisions were made in the development meetings. Instead of customer support filing "bugs" from customers for things that work as intended, they can ask an LLM how something is supposed to work without having to read a 200-page requirements document. Onboarding new developers can be easier because they can now get summaries of decisions made before they started instead of being confused by a piece of code.
Another tool that's being threatened is forums and question/answer services like Stack Overflow. I have a lot of mixed feelings and concerns about this one. On the one hand, LLMs have saved me countless hours of crawling through search results and SO questions/answers. On the other hand, LLMs can only do this because they trained on the data from those tools. If those tools don't get traffic because of LLMs, then they may cease to exist. This would leave LLMs stagnant because there would be no new data to train on. And no, an LLM is not AGI, and it cannot come up with its own questions and answers outside its training data. I'm uncertain what the solution is here, but there needs to be one eventually.
In general though, I'm excited to see how LLMs will eventually change the landscape of software tools. The holy grail for all of them is to solve a problem while minimizing the amount of human effort required. That goal was never met, no matter how new or well thought out a tool was. LLMs can meet that goal.