• 0 Posts
  • 41 Comments
Joined 1 year ago
Cake day: August 11th, 2023



  • I rushed to just grab that code block from Wikipedia. But the selection of which characters are considered emoji is not arbitrary. The Unicode Consortium (their Unicode Emoji Standard and Research Working Group, to be exact) publishes those lists, along with guidelines on how the emoji should be rendered. I believe the most recent version of the standard is Emoji 15.1.

    Edit: I realized I’m going off track here by just reacting to comments and forgetting my initial point. The difference I was initially alluding to is in selection criteria. The criteria for assigning a character a Unicode codepoint are very different from the criteria for creating a new emoji. Bitcoin has a unique symbol and there is a real need to use that symbol in written material. Having a Unicode character for it solves that problem, and indeed one was added. The emoji working group has other selection criteria (which is why you have emoji for eggplant and flying money, and other things that are not otherwise characters). So the fact that a certain character exists, despite its very limited use, has no bearing on whether something else should have an emoji to represent it.
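
    To make the distinction concrete, here’s a minimal Python sketch using only the standard unicodedata module (the emoji list itself lives in the Consortium’s emoji-data.txt, which this sketch doesn’t parse):

    ```python
    import unicodedata

    # Both characters have Unicode codepoints; only the second is on the emoji list.
    for ch in ("\u20BF", "\U0001F346"):   # BITCOIN SIGN, AUBERGINE (eggplant)
        print(f"U+{ord(ch):04X} {unicodedata.name(ch)} (category {unicodedata.category(ch)})")

    # U+20BF BITCOIN SIGN (category Sc)  -> an ordinary currency-symbol character
    # U+1F346 AUBERGINE (category So)    -> listed as an emoji in emoji-data.txt
    ```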


  • There’s no ambiguity. Emoji are characters in the Emoticons code block (U+1F600…U+1F64F). Emoji are indeed a subset of characters, but anything outside that block is not an emoji.

    Edit: I jumped the gun on that definition; I just took the code block from Wikipedia. But there is no ambiguity about which characters are emoji and which are not. The Unicode Consortium publishes lists of emoji and guidelines on how they should be rendered.
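
    To show why the Emoticons block alone doesn’t cover it, here’s a minimal Python sketch (the in_emoticons_block helper is made up for this example; the authoritative list is the Consortium’s emoji-data.txt):

    ```python
    # Hypothetical helper: checks only the Emoticons block (U+1F600..U+1F64F).
    def in_emoticons_block(ch: str) -> bool:
        return 0x1F600 <= ord(ch) <= 0x1F64F

    print(in_emoticons_block("\U0001F600"))  # True  -- 😀 GRINNING FACE, inside the block
    print(in_emoticons_block("\U0001F346"))  # False -- 🍆 AUBERGINE, an emoji from another block
    ```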





  • It’s not lying or hallucinating. It’s describing exactly what it found in the search results. There is a web page with that title from that date. The problem is that the web page is on Pinterest and the title is the result of aggressive SEO. These SEO practices are what made Google largely useless for the past several years, and an AI built on top of these useless results will be just as useless.




  • Deep learning did not shift any paradigm. It’s just more advanced programming. Gen AI is not intelligence; it’s just really well-trained ML. ChatGPT can generate text that looks true and relevant, and that’s its goal. It doesn’t have to be true or relevant, it just has to look convincing. And it does. But there’s no form of intelligence at play there. It’s just advanced ML models taking an input and guessing the most likely output (see the toy sketch at the end of this comment).

    Here’s another interesting article about this debate: https://ourworldindata.org/ai-timelines

    What we have today does not exhibit even the faintest signs of actual intelligence. Gen AI models don’t actually understand the output they are producing, which is why they so often give self-contradictory results. The algorithms will continue to be fine-tuned to produce fewer such mistakes, but that won’t change the core of what gen AI really is. You can’t teach ChatGPT how to play chess, or a new language, or music. The same model can be trained to do one of those tasks instead of chatting, but that’s not how intelligence works.
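
    As a toy illustration of “guessing the most likely output,” here’s a tiny bigram counter in Python. It is nowhere near how ChatGPT actually works, but it shows the basic “pick the most likely continuation” idea:

    ```python
    from collections import Counter, defaultdict

    corpus = "the model guesses the most likely next word given the words before it".split()

    # Count which word follows which (a tiny bigram table).
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def most_likely_next(word: str) -> str:
        # Pick the continuation seen most often in the training text.
        return following[word].most_common(1)[0][0]

    print(most_likely_next("the"))  # prints whichever word most often followed "the"
    ```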


  • Any type of content generated by AI should be reviewed and polished by a professional. If you’re putting raw AI output out there directly then you don’t care enough about the quality of your product.

    For example, there are tons of nonsensical articles on the internet that were obviously generated by AI, and their sole purpose is to crowd search results and generate traffic. The content writers those articles replaced were paid $1/article or less (I work in the freelancing business and I know these types of jobs), not people with any actual training in content writing.

    But besides the tons of prompt-crafting and other similar AI support jobs now flooding the market, there’s also huge investment in hiring highly skilled engineers to launch various AI-related products while the hype is high.

    So overall a ton of badly paid jobs were lost and a lot of better paid jobs were created.

    The worst part will come when the hype dies and the next trend comes along. Entire AI teams will be laid off to make room for others.


  • See the sources above, and many more. We don’t need one or two breakthroughs, we need a complete paradigm shift. We don’t even know where to start for AGI. There’s a bunch of research, but nothing has really come out of it yet. Weak AI has made impressive leaps in the past few years, but the only connection between weak and strong AI is the name. Weak AI will not become strong AI as it continues to evolve; the two are completely separate avenues of research. Weak AI is still advanced algorithms. You can’t get AGI with just code. We’ll need a completely new type of hardware for it.



  • All progress comes with old jobs becoming obsolete and new jobs being created. It’s just natural.

    But AI is not going to replace any skilled professionals soon. It’s a great tool to add to professionals’ arsenal, but non-professionals who use it to completely replace hiring a professional will get what they pay for (and those people would have never actually paid for a skilled professional in the first place; they’d have hired the cheapest outsourced wannabe they could find; after first trying to convince a professional that exposure is worth more than money)