In October, U.S. employers announced 153,074 job cuts, the highest October total since 2003. Although AI was widely blamed for these layoffs, the primary driver was cost-cutting; AI accounted for only about 20% of the cuts. The largest reductions came in tech and warehousing, sectors that over-hired during the pandemic and are now readjusting.

The Atlanta Fed's GDPNow model projects real GDP growth of -4.0%, a figure shaped in part by speculative AI investment in data centers and chips. Despite the investment boom, AI adoption remains shallow: Census Bureau data show only modest in-production usage. Economic theory suggests that broad productivity gains from AI will stay limited unless task transformation becomes widespread, and while some micro-level evidence shows productivity boosts from AI, economy-wide transformation has yet to materialize.

As the economy slows and margins tighten, layoffs will likely continue. The bulk of job cuts, then, stems from economic conditions rather than AI advances.