• @MystikIncarnate@lemmy.ca
    link
    fedilink
    English
    7
    4 hours ago

    IT guy checking in.

    The only time I’ve ever seen drive temp sensor alarms is on server RAID arrays and similar enterprise hard drives/SSDs… Never in my life have I seen one available on a consumer device, nor have I seen an alarm for a drive temp actually go off. It just doesn’t happen.

    IMO, this is one of those language barriers where people call their computer chassis (and everything in it) the “hard drive”.

    Applying that assumption, their updated statement is: His computer overheated.

    Idk what kind of shit system he’s running on that 60k rows would cause overheating, but ok.

    • @theparadox@lemmy.world
      link
      fedilink
      English
      2
      edit-2
      4 hours ago

      As another IT guy here, it could also be a shitty method of analysis that he got from ChatGPT. As an amateur coder/script writer, the kinds of code I’ve seen people use from these bots is disturbing. One of my coworkers asked me for help after trying to cobble something together from bots. There were variables declared and never used, variables that were never assigned values but were used in expressions… it was like a ransom note made from magazine letters, except it couldn’t spell coherently.

  • @LillyPip@lemmy.ca
    link
    fedilink
    73
    24 hours ago

    This cannot be real, wtf. This is cartoon levels of ineptitude.

    Or sabotage by someone heading out? Please let this be resistance sabotage they haven’t noticed yet.

  • @BenLeMan@lemmy.world
    link
    fedilink
    55
    23 hours ago

    You’re not supposed to place your laptop directly in the lap of your fur suit. Always leave an air gap for ventilation, smh.

  • @nonentity@sh.itjust.works
    link
    fedilink
    37
    24 hours ago

    Either she knows something novel, where processing data using voice coils is somehow beneficial, or she’s someone who calls their computer a ‘hard drive’, which summarily negates any presumption of technical competence.

    • 74 183.84
      link
      fedilink
      English
      10
      15 hours ago

      Bro seriously fuck off, my phone is overheating now. Thanks

  • @RussianBot8453@lemmy.world
    link
    fedilink
    83
    1 day ago

    I’m a data engineer who processes 2-billion-row, 3,000-column datasets every day, and I open shit in Excel with more than 60k rows. What the hell is this chick talking about?

    • @zenpocalypse@lemm.ee
      link
      fedilink
      English
      21
      1 day ago

      Seems like a convenient excuse for someone who doesn’t know what they’re doing and needs a reason why they haven’t finished yet.

      The whole post is complete bs in multiple ways. So weird.

    • @person420@lemmynsfw.com
      link
      fedilink
      19
      1 day ago

      Some interesting facts about Excel I learned the hard way:

      1. It only supports 1,048,576 rows (2^20) per sheet.
      2. It silently corrupts numbers longer than 15 digits, because it stores every number as a float with about 15 significant digits of precision.

      Not really related to what you said, but I’m still sore about the bad data import that cost me days of cleanup.
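The 15-digit issue above comes from Excel storing every number as an IEEE 754 double. A quick sketch in Python, whose `float` is the same double type, shows the failure mode; the 17-digit ID is invented for illustration:

```python
# Excel stores numbers as IEEE 754 doubles, which keep only ~15
# significant digits. Python's float is the same type, so it
# reproduces the corruption. The 17-digit ID below is made up.
account_id = "12345678901234567"         # e.g. a long ID from a CSV import
as_float = float(account_id)             # what Excel does on import

print(int(as_float))                     # 12345678901234568 -- last digit corrupted
print(int(as_float) == int(account_id))  # False
```

This is why long account or tracking numbers should be imported as text, not numbers.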

      • @Mniot@programming.dev
        link
        fedilink
        English
        5
        1 day ago

        The row limitation seems, to me, like an actually-good thing. Excel is for data where you might conceivably scroll up and down looking at it and 1M is definitely beyond the ability of a human even to just skim looking for something different.

        An older version of Excel could only handle 64k rows and I had a client who wanted large amounts of data in Excel format. “Oh sorry, it’s a Microsoft limitation,” I was thrilled to say. “I have no choice but to give you a useful summarization of the data instead of 800k rows (each 1000 columns wide) of raw data.”

        • @frezik@midwest.social
          link
          fedilink
          2
          edit-2
          14 hours ago

          Some time ago, I heard a story of CS and Econ professors having lunch together. The Econ professor was excited that Excel was going to release a version that blew out the 64k row limit. The CS professor nearly choked on his lunch.

          Dependence on Excel has definitely caused bad papers to be published in the Econ space, and has had real-world consequences. There was a paper years ago claiming that once a country’s debt gets above 90% of GDP, its economy goes into a death spiral. It was passed around as established fact by the sorts of politicians who justify austerity. Problem was, nobody could reproduce the results. Then an Econ grad student asked the original authors for their Excel spreadsheet, and found a coding error in the formulas. Once corrected, the conclusion disappeared.

  • @GreenKnight23@lemmy.world
    link
    fedilink
    49
    1 day ago

    I smell something, but it’s not overheating electronics.

    I’ve processed over 5 million records on a laptop that’s almost 10 years old. it took two days to get my results.

    there’s no way 60,000 records overheated ANYTHING.

  • @jkercher@programming.dev
    link
    fedilink
    English
    32
    1 day ago

    60k rows of anything will be pulled into the file cache and do very little work on the drive. Possibly none after the first read.

    • @wise_pancake@lemmy.ca
      link
      fedilink
      5
      24 hours ago

      Are you telling me there’s a difference between an inner and a cross join?

      Cross join is obviously faster, I don’t even have to write “on”

  • Psaldorn
    link
    fedilink
    151
    1 day ago

    From the same group that doesn’t understand joins and thinks nobody uses SQL, this is hardly surprising.

    Probably got an LLM running locally and is asking it to get the data, which is then running 10-level-deep subqueries to achieve what 2 inner joins would in a fraction of the time.
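For illustration, here is roughly what the two-inner-join version looks like, using sqlite3 and a completely made-up three-table schema (the actual data layout is unknown):

```python
import sqlite3

# Hypothetical schema, invented for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE agencies  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE contracts (id INTEGER PRIMARY KEY, agency_id INTEGER, vendor TEXT);
    CREATE TABLE payments  (id INTEGER PRIMARY KEY, contract_id INTEGER, amount REAL);
    INSERT INTO agencies  VALUES (1, 'SSA');
    INSERT INTO contracts VALUES (1, 1, 'Acme Corp');
    INSERT INTO payments  VALUES (1, 1, 250000.0);
""")

# Two inner joins resolve the whole chain in one statement, letting the
# planner use the primary keys instead of re-running nested subqueries.
rows = con.execute("""
    SELECT a.name, c.vendor, p.amount
    FROM payments p
    JOIN contracts c ON c.id = p.contract_id
    JOIN agencies  a ON a.id = c.agency_id
""").fetchall()
print(rows)  # [('SSA', 'Acme Corp', 250000.0)]
```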

    • @_stranger_@lemmy.world
      link
      fedilink
      63
      edit-2
      1 day ago

      You’re giving this person a lot of credit. It’s probably all in the same table, and this idiot is probably doing something like a for-loop over an integer range (the length of the table) where it pulls the entire table down every iteration of the loop, dumps it to a local file, and then uses plain-text search or some really bad regexes to find the data they’re looking for.
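A minimal sketch of that anti-pattern next to the single-query version, using sqlite3 and an invented table (1,000 rows standing in for 60k):

```python
import sqlite3

# Invented table, purely to illustrate the anti-pattern described above.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
con.executemany("INSERT INTO t (name) VALUES (?)",
                [(f"row{i}",) for i in range(1000)])

# Bad: loop over an integer range, re-fetch the ENTIRE table each
# iteration, and plain-text search the dump in Python.
hits_slow = []
n = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
for i in range(n):
    dump = con.execute("SELECT id, name FROM t ORDER BY id").fetchall()
    if "42" in dump[i][1]:
        hits_slow.append(dump[i])

# Good: one query; the database does the filtering in a single pass.
hits_fast = con.execute(
    "SELECT id, name FROM t WHERE name LIKE '%42%' ORDER BY id").fetchall()
print(hits_slow == hits_fast)  # True
```

Both produce the same rows, but the loop version reads the table N times over.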

      • @morbidcactus@lemmy.ca
        link
        fedilink
        24
        1 day ago

        Considering that’s nearly exactly some of the answers I’ve received during the technical part of interviews for jr data eng roles, you’re probably not far off.

        Shit, I’ve seen production solutions done up that look like that, fighting the optimiser every step of the way (amongst other things).

      • @indepndnt@lemmy.world
        link
        fedilink
        10
        1 day ago

        I think you’re still giving them too much credit with the for loop and regex and everything. I’m thinking they exported something to Excel, got 60k rows, then tried to add a lookup formula to them. Since, you know, they don’t use SQL. I’ve done ridiculous things like that in Excel, and it can get so busy that it slows down your whole computer, which I can imagine someone could interpret as their “hard drive overheating”.

      • @makingStuffForFun@lemmy.ml
        link
        fedilink
        4
        edit-2
        1 day ago

        I have to admit I still have some legacy code that does that.

        Then I found pandas. Life changed for the better.

        Now I have lots of old code that I’ll update, “one day”.

        However, even my old code, terrible as it is, does not overheat anything, and can process massively larger sets of data than 60,000 rows without any issue except poor efficiency.

  • @zalgotext@sh.itjust.works
    link
    fedilink
    84
    edit-2
    6 hours ago

    my hard drive overheated

    So, this means they either have a local copy on disk of whatever database they’re querying, or they’re dumping a remote db to disk at some point before/during/after their query, right?

    Either way, I have just one question - why?

    Edit: found a more in-depth explanation elsewhere in the thread: https://xcancel.com/DataRepublican/status/1900593377370087648#m

    So yeah, she’s apparently toting around an external hard drive with a copy of the “multiple terabytes” large US spending database, running queries against it, then dumping the 60k-row result set to CSV for further processing.

    I’m still confused at what point the external drive overheats, even if she is doing all this in a “hot humid” hotel room where she can’t run any fans, I guess because her kids were asleep?

    But like, all of that just adds more questions, and doesn’t really answer the first one - why?

      • @spooky2092@lemmy.blahaj.zone
        link
        fedilink
        English
        45
        1 day ago

        Plus, 60k is nothing. One of our customers had a database that was over 3M records before it got some maintenance. No issue with overheating lol

        • @surph_ninja@lemmy.world
          link
          fedilink
          26
          edit-2
          1 day ago

          I run queries throughout the day that can easily return 8 million+ rows. Granted, it takes a few minutes to run, but it has never caused a single issue with overheating, even on slim PCs.

          This makes no fucking sense. 60k rows would return in a flash even on shitty hardware. And if it taxes anything, it’s gonna be the RAM or CPU, not the hard drive.

          • @T156@lemmy.world
            link
            fedilink
            English
            2
            edit-2
            19 hours ago

            In my experience, the only time that I’ve taxed a drive when doing a database query is either when dumping it, or with SQLite’s vacuum, which copies the whole thing.

            For a pretty simple search like OP seems to be doing, the indices should have taken care of basically all the heavy lifting.
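That point can be sketched with sqlite3 (table and column names invented): once an index exists, a lookup over 60k rows is a B-tree search rather than a full scan.

```python
import sqlite3

# Invented 60k-row table, standing in for the query in the story.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE spending "
            "(id INTEGER PRIMARY KEY, recipient TEXT, amount REAL)")
con.executemany("INSERT INTO spending (recipient, amount) VALUES (?, ?)",
                ((f"vendor_{i}", float(i)) for i in range(60_000)))
con.execute("CREATE INDEX idx_recipient ON spending (recipient)")

# The lookup itself is near-instant...
row = con.execute("SELECT amount FROM spending WHERE recipient = ?",
                  ("vendor_59999",)).fetchone()
print(row)  # (59999.0,)

# ...and the query plan confirms the index does the heavy lifting:
plan = con.execute("EXPLAIN QUERY PLAN "
                   "SELECT amount FROM spending WHERE recipient = ?",
                   ("vendor_59999",)).fetchall()
print(plan[0][-1])  # e.g. "SEARCH spending USING INDEX idx_recipient (recipient=?)"
```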

        • I literally work with ~750,000 line exports on the daily on my little Lenovo notebook. It gets a little cranky, especially if I have a few of those big ones open, but I have yet to witness my hard drive melting down over it. I’m not doing anything special, and I have the exact same business-economy tier setup 95% of our business uses. While I’m doing this, that little champion is also driving 4 large monitors because I’m actual scum like that. Still no hardware meltdowns after 3 years, but I’ll admit the cat likes how warm it gets.

          750k lines is just for the branch specific item preferences table for one of our smaller business streams, too - FORGET what our sales record tables would look like, let alone the whole database! And when we’re talking about the entirety of the social security database, which should contain at least one line each in a table somewhere for most of the hundreds of millions of people currently living in the US, PLUS any historical records for dead people??

          Your hard drive melting after 60k lines, plus the attitude that 60k lines is a lot for a major database, speaks to GLARING IT incompetence.

      • Fuck spez
        link
        fedilink
        English
        4
        1 day ago

        I don’t think I’ve seen a brand new computer in the past decade that even had a mechanical hard drive at all unless it was purpose-built for storing multiple terabytes, and 60K rows wouldn’t even take multiple gigabytes.

      • Reminds me of those 90s ads about hackers making your pc explode.

        Musk gonna roll up in a wheelchair, “the attempt on my life has left me ketamine addicted and all knowing and powerful.”

      • @wise_pancake@lemmy.ca
        link
        fedilink
        1
        1 day ago

        I have, when a misconfigured Spark job I was debugging was filling hard drives with TBs of error logs and killing the drives.

        That was a pretty weird edge case though, and I don’t think the drives were melting; plus this was closer to 10 years ago, when SSD write lifetimes were crappy and we’d bought a bad batch of drives.

    • @Bosht@lemmy.world
      link
      fedilink
      English
      36
      1 day ago

      I’d much sooner assume that they’re just fucking stupid and talking out of their ass tbh.

      • @kautau@lemmy.world
        link
        fedilink
        16
        edit-2
        1 day ago

        Same as Elon when he confidently told off engineers during his takeover of Twitter, or gestures broadly at Mr. Dunning-Kruger himself

        Wonder if it’s an SQL DB

        Elon probably hired confident right wingers whose parents bought and paid their way through prestigious schools. If he hired anyone truly skilled and knowledgeable, they’d call him out on his bullshit. So the people gutting government programs and passing around private data like candy are just confidently incorrect

    • @zenpocalypse@lemm.ee
      link
      fedilink
      English
      17
      edit-2
      1 day ago

      Even if it was local, a raspberry pi can handle a query that size.

      Edit - honestly, it reeks of a knowledge level that calls the entire PC a “hard drive”.

      • @T156@lemmy.world
        link
        fedilink
        English
        0
        edit-2
        19 hours ago

        Unless they actually mean the hard drive, and not the computer. I’ve definitely had a cheap enclosure overheat and drop out on me before when hammering the drive with seeks, although it was more likely the enclosure’s own electronics overheating. Unless their query was rubbish, a simple database scan/search like that should be fast, and not demanding in the slightest. Doubly so if it’s a dedicated database, and not some embedded thing like SQLite. A few dozen thousand rows should be basically nothing.

    • @GoodEye8@lemm.ee
      link
      fedilink
      English
      21
      1 day ago

      My one question would be “How?”

      What the hell are you doing that your hard drives are overheating? And how do you even know it’s overheating? I’m like 90% certain hard drives (except NVMe, if we’re being liberal with the meaning of “hard drive”) don’t even have temperature sensors.

      The only conclusion I can come to is that everything he’s saying is just bullshit.

          • @Mniot@programming.dev
            link
            fedilink
            English
            4
            1 day ago

            Can we think of any device someone might have that would struggle with 60k? Certainly an ESP32 chip could handle it fine, so most IoT devices would work…

            • @T156@lemmy.world
              link
              fedilink
              English
              2
              19 hours ago

              Unless the database was designed by someone who only knows of data as that robot from Star Trek, most would be absolutely fine with 60k rows. I wouldn’t be surprised if the machine they’re using caches that much in RAM alone.

            • @zenpocalypse@lemm.ee
              link
              fedilink
              English
              3
              1 day ago

              Right? There’s no part of that xeet that makes any real sense coming from a “data engineer.”

              Terrifying, really.

    • @Adalast@lemmy.world
      link
      fedilink
      4
      1 day ago

      Why? Because they feel the need to have local copies of sensitive financial information because… You know… They are computer security experts.

  • Tiefling IRL
    link
    fedilink
    115
    1 day ago

    60k isn’t that much, I frequently run scripts against multiple hundreds of thousands at work. Wtf is he doing? Did he duplicate the government database onto his 2015 MacBook Air?

    • @4am@lemm.ee
      link
      fedilink
      53
      1 day ago

      A TI-86 can query 60k rows without breaking a sweat.

      If his hard drive overheated from that, he is doing something very wrong, very unhygienic, or both.

    • @arotrios@lemmy.world
      link
      fedilink
      English
      7
      1 day ago

      Seriously - I can parse multiple tables of 5+ million rows each… in EXCEL… on a 10 year old desktop and not have the fan even speed up. Even the legacy Access database I work with handles multiple million+ row tables better than that.

      Sounds like the kid was running his AI hamsters too hard and they died of exhaustion.