That's a very limited solution, though, and it's super incomplete. Last time I tried, the user needed to install Python first, and you couldn't bundle it. And it's not a myth; it's something I have experienced myself. PyInstaller falls apart the moment you try to pull in the packages that make Python useful in the first place for tons of people (numpy, pytorch, etc.).
PyInstaller does not require that the target machine have Python installed.
And it will produce a single-file executable that bundles the interpreter, application Python code, extensions, and any other resources (data files, images, fonts, etc).
Both numpy and pytorch are explicitly supported, although that isn't guaranteed for every extension package.
None of that is new; this is the way PyInstaller has always worked.
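One practical consequence of the bundling: a frozen app's data files get unpacked to a temporary directory, not next to your source, so path lookups need to account for that. PyInstaller documents `sys.frozen` and `sys._MEIPASS` for exactly this; here's a minimal sketch (the `data.txt` name is just a placeholder):

```python
import sys
from pathlib import Path

def resource_path(name: str) -> Path:
    """Resolve a bundled data file whether we run frozen or from source.

    PyInstaller sets sys.frozen on the bundled interpreter and unpacks
    resources under sys._MEIPASS, so a frozen app must not assume files
    live next to the original .py source.
    """
    if getattr(sys, "frozen", False):
        base = Path(getattr(sys, "_MEIPASS", "."))
    else:
        base = Path(__file__).resolve().parent
    return base / name

# Example: resource_path("data.txt") points into the source tree during
# development and into the one-file bundle's extraction dir when frozen.
```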
However, on Linux things kinda suck: PyInstaller packages your current interpreter, so if it's linked against things that differ on another Linux system (e.g. a system without glibc) it will often crap itself. But such is the joy of distributing compiled software to Linux users :).
This is a genuine issue, and it takes some work to avoid.
I typically use an older distribution as my build box: the dependencies pulled in by the interpreter are then older versions, and when the binary runs on a newer distribution, the libraries' backward compatibility will ensure it works ... usually.
I've messed about with using a completely independent build tree for this: something that depends only on libc/libm/etc from the OS, with all other dependencies being part of my build. That seems to work pretty universally, but it's a lot of work.
I've been meaning to look into leveraging the Flatpak runtimes for this: they seem to have pretty similar concerns.