‘AI’ fluffers sure do love the taste of grift-flavoured tokens.
I’d ask what you were thinking, but it’s clear that played no material role in this extrusion. Extrapolating an assertion I constrained to a specific topic onto the entirety of ‘tech’ is a bellowing straw man.
Further, the exclusively US-centric examples of inappropriate stewards reveal a vantage squarely rooted inside that noxious bubble. The invocation of treason further betrays an affinity for national subservience.
To refine my original point: in my observation, the only entities who find LLMs impressive are those who expressly lack proven expertise in the area to which they’re being applied. The correlation appears to be nearly linear and inverse.
LLMs could eventually prove innately useful, but there’s no indication they’re close to that, let alone traversing a relevant vector.
Personally, I find any world populated with entities who are impressed by LLMs to be a world not worth living in.