• 3 Posts
  • 12 Comments
Joined 1 year ago
Cake day: June 2nd, 2023

  • This is much easier said than done. In large parts of the United States you can’t reliably commute by public transit. For me personally, without a car, a one-way 40-mile trip to the nearest major city would take 5 hours. That’s 2 different trains and 2 different buses.

    Add to that the fact that the station closest to me only runs a few trains a day, and my options are very limited.

    Even if we ignore the current train schedule and assume a train comes by every 5 minutes, it would still be a 2-hour trip that costs me $20 one way. I could bike the rest of the way to skip the last 2 buses.

    There are rail passes I could get, but those run $477/month. It’s cheaper to lease a Tesla at that point (rough math at the end of this comment).

    Owning a car is pretty much the only reasonable way of getting around in many parts of the U.S.
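
    For what it’s worth, the rough math behind that comparison. The ~21 commuting days and the lease figure are my own assumptions, not quotes:

    ```python
    # Back-of-envelope monthly commute costs; every figure here is an assumption.
    fare_one_way = 20            # $ per one-way trip, from the example above
    commute_days = 21            # assumed workdays per month
    pay_per_ride = fare_one_way * 2 * commute_days   # round trips -> ~$840/month
    rail_pass = 477              # $ monthly pass quoted above
    tesla_lease = 450            # rough assumed lease payment, varies a lot by deal

    print(pay_per_ride, rail_pass, tesla_lease)  # 840 477 450
    ```

    So the pass only makes sense against paying per ride, and a basic lease lands in the same ballpark or cheaper, before you even count the travel time.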

  • Yeah, another thing I know I’ll be using it (or at least Bing’s Chat) for is summarizing large documents, especially in the sense of becoming a more informed voter.

    I don’t have time to read through the thousands of pages of legalese that our lawmakers come up with. But instead of having to wait or rely only on summaries from others, I can run a bill through an AI to get a summary of each section and then read into anything that piques my interest (rough sketch at the end of this comment).

    It might even be interesting to train a smaller LLM that does this more efficiently.

    The next step would be an LLM that pays more attention to the unintended consequences of laws due to the way they’re written. But for something really effective, I imagine that would require the assistance of a large number of experts in the field, and/or a lot of research on laws being overturned, loopholes being closed, etc.

    Even then, it’s important that we understand these tools are far from perfect, and we should question their results rather than accepting them at face value.
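
    To be concrete, here’s the kind of thing I have in mind. This is just a sketch using the pre-1.0 OpenAI Python library; the “SEC.” splitting and the prompt wording are my own assumptions about how you’d slice up a typical bill:

    ```python
    # Sketch: summarize a long bill section by section with an LLM.
    # Assumes the pre-1.0 openai library and that sections start with "SEC. <n>".
    import re
    import openai

    openai.api_key = "sk-..."  # your API key

    def summarize_section(section: str) -> str:
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system",
                 "content": "Summarize this section of a bill in plain English, "
                            "noting anything a voter might want to look into."},
                {"role": "user", "content": section},
            ],
        )
        return response["choices"][0]["message"]["content"]

    with open("bill.txt") as f:
        text = f.read()

    # Split on headings like "SEC. 101." (formatting assumption).
    for section in re.split(r"\n(?=SEC\. \d+)", text):
        print(summarize_section(section[:12000]))  # crude truncation to fit context
        print("-" * 40)
    ```

    Nothing fancy, and per the last paragraph the output still needs to be checked against the actual text, but it gets you a skimmable per-section digest.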


  • Glad someone mentioned the lawyer who screwed up by citing fake cases that ChatGPT made up. It will be interesting to see what comes of this.

    Additionally, not a lot of people realize that they’ve agreed to an indemnification clause when using ChatGPT (or what that means).

    Basically, OpenAI can send you the legal bills for any lawsuits that come from your use of ChatGPT. So if you “jailbreak” ChatGPT and post a screenshot of it telling you the recipe for something illegal, OpenAI could end up with a lawsuit on their hands, and they could bill you for all of the legal fees incurred.

    Possibly the first case of this we’ll see will be related to the defamation claim that a certain mayor in Australia could bring against OpenAI: https://gizmodo.com/openai-defamation-chatbot-brian-hood-chatgpt-1850302595

    Even if OpenAI wins that lawsuit, they will most likely bill the user who posted the screenshot of ChatGPT defaming the mayor.