Would it be a good idea to make Python store compiled code in a file stream instead of .pyc files?


I'm wondering whether it wouldn't be better if Python stored the compiled code in a file stream attached to the original source file:

  • On Windows, using ADS (Alternate Data Streams)
  • On OS X, using resource forks
  • On Linux, using extended file attributes (if the compiled file is under 32k)

This way the compiled code would not pollute the source tree, and it would avoid problems like a stale .pyc being loaded and used after the .py it was created from has been removed.
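To make the Linux case concrete, here is a minimal sketch (not from the original post) of attaching bytecode to a source file via extended attributes with os.setxattr/os.getxattr. The file name example.py and the attribute name user.python.pyc are assumptions for illustration; this only works on filesystems with user xattrs enabled and for payloads within the filesystem's xattr size limit.

```python
import os
import py_compile

SRC = "example.py"            # hypothetical source file
XATTR = "user.python.pyc"     # hypothetical attribute name for the cached bytecode

# Byte-compile the source into a temporary .pyc and read it back.
pyc_path = py_compile.compile(SRC, cfile=SRC + ".tmp.pyc")
with open(pyc_path, "rb") as f:
    bytecode = f.read()
os.remove(pyc_path)

# Attach the bytecode to the source file itself as an extended attribute,
# so nothing extra appears in the source tree.
os.setxattr(SRC, XATTR, bytecode)

# Later, the cached bytecode can be retrieved straight from the source file.
cached = os.getxattr(SRC, XATTR)
print(f"{SRC}: {len(cached)} bytes of cached bytecode in {XATTR}")
```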

What do you think about it: is it a good idea or not? What problems do you see?

You'd sacrifice a lot of portability that exists right now: .pyc files are unusually portable (they are commonly shared by heterogeneous systems on a LAN through a network file system, for example, although I was never a fan of the performance characteristics of that approach), while your envisaged approach would only work on specific systems and (I suspect) never across a network mount between heterogeneous machines.

So you surely wouldn't want to make that behavior the default, but it could definitely be available as an option to be requested specifically, if none of the above issues apply to your deployment environment. Among the options you don't mention, there is another "nice option" that I would actually use almost 100 times more often: keeping the .pyc "files" in a database instead of placing them in the filesystem.

The nice thing is that this is easily accomplished as an add-on "import hook" (the details are Python-version dependent: in the most recent versions, via Brett Cannon's importlib masterpiece, while older Python versions need other mechanisms, which can make backporting harder... it really depends on which versions you need to support, a detail your question doesn't reveal, so I won't go into implementation details here; the general idea doesn't change much across implementations).
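Since the answer only gestures at the import-hook approach, here is a minimal sketch of the database variant using importlib on Python 3. It assumes a hypothetical SQLite table modules(name TEXT PRIMARY KEY, source TEXT) holding module source text; a real implementation would likely store marshalled bytecode and handle packages and cache invalidation.

```python
import importlib.abc
import importlib.util
import sqlite3
import sys


class DBFinder(importlib.abc.MetaPathFinder, importlib.abc.SourceLoader):
    """Finder/loader that serves module source from a SQLite database."""

    def __init__(self, db_path):
        self._conn = sqlite3.connect(db_path)

    # -- finder protocol --
    def find_spec(self, fullname, path=None, target=None):
        # Only claim modules that actually exist in the database.
        if self._fetch(fullname) is None:
            return None
        return importlib.util.spec_from_loader(fullname, self)

    # -- SourceLoader protocol --
    def get_filename(self, fullname):
        # Synthetic "path" used as the module's __file__ and for get_data().
        return f"<db:{fullname}>"

    def get_data(self, path):
        name = path[len("<db:"):-1]
        source = self._fetch(name)
        if source is None:
            raise OSError(path)
        return source.encode("utf-8")

    def _fetch(self, name):
        row = self._conn.execute(
            "SELECT source FROM modules WHERE name = ?", (name,)
        ).fetchone()
        return row[0] if row else None
```

Usage would be along the lines of sys.meta_path.insert(0, DBFinder("modules.db")); after that, a plain import statement consults the database before the normal filesystem finders.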

