During a meeting not too long ago someone was like “we’re looking into using AI to generate these reports.”
So they have a dude who gets requests to generate reports. They're not the same reports each time, but custom ones pieced together from scattered bits of data throughout a huge database. Ergo you can't really write a program that extracts them on demand; a person actually has to sit down and piece each one together.
Now they want to use LLMs to take that off his plate, since it isn't technically his role anyway.
They scoffed at me when I asked how important it is that these reports are accurate, but I mean, it's a valid concern. Best case you get occasionally-hallucinated reports; worst case you get something that wreaks havoc on the database because the model just spits out garbage SQL.
I’m very glad that my role doesn’t involve that BS.