Hacker News

I understand that if you're building a PEX file then all dependencies must be reinstalled into it every time; however, you might still be able to leverage container layer caching to save the download time.
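A minimal Dockerfile sketch of that idea (illustrative file and image names, assuming a pip/pex-based build): copying only the requirements file first keeps the dependency-download layer cached until the requirements actually change, so only the final PEX build step reruns on every commit.

```dockerfile
FROM python:3-slim
WORKDIR /app

# This layer is cached until requirements.txt changes,
# so dependency downloads are skipped on most builds.
COPY requirements.txt .
RUN pip wheel -w /wheels -r requirements.txt

# Only these layers are rebuilt on every code change.
COPY . .
RUN pip install pex && \
    pex . -r requirements.txt -f /wheels -o /app.pex
```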

CI bills are awful; I always deploy my own CI server, a gitlab-runner, alongside a Traefik instance I spawn to practice eXtreme DevOps.

More than 20 daily contributors is nice, but I must admit that I have contributed to some major Python projects that don't have a packaging problem, such as Ansible or Django. So I'm not sure the number of contributors is really a factor in packaging success. That said, building sdists and wheels is something that happens in CI for me; it's just a matter of adding this to my .gitlab-ci.yml:

    pypi:
        stage: deploy
        script: pypi-release
And adding TWINE_{USERNAME,PASSWORD} to the CI variables. The other trick is to use the excellent setupmeta or something like that (OpenStack also has a solution, pbr) so that setup.py discovers the version based on the git tag, or publishes a dev version.
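For reference, here is a hedged sketch of what such a deploy job might expand to without a wrapper script, assuming the standard build and twine tools (twine reads TWINE_USERNAME and TWINE_PASSWORD from the environment, which is why setting those CI variables is enough):

```yaml
pypi:
    stage: deploy
    script:
        - pip install build twine
        - python -m build        # produces the sdist and wheel in dist/
        - twine upload dist/*    # reads TWINE_USERNAME / TWINE_PASSWORD
    only:
        - tags                   # release on git tags, matching tag-based versioning
```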

That's how I automate the packaging of all my Python packages (I have something similar for my NPM packages). As for virtualenvs, it's true that they are great, but I don't use them; I use pip install --user. The drawback is that all your software needs to run with the latest releases of its dependencies, otherwise you have to contribute the fixes upstream. But I'm a happier developer this way, and my colleagues aren't blocked by a breaking upstream release very often: they just pin a version if they need to keep working while somebody changes our code, and contributes to the dependencies, to make everything work with the latest versions.

I don't think that other languages are immune to version-compatibility issues; that problem isn't language-dependent. Either you pin your versions and forget about upstream releases, or you aggressively and continuously integrate upstream releases into your code and your dependencies.
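Illustratively, the two strategies show up directly in a requirements file (package names and version numbers here are just examples):

```
# Pinned: reproducible builds; upstream releases are ignored until you bump the pin.
Django==4.2.11

# Unpinned: every CI run integrates the latest release, so breakage surfaces early.
requests>=2.0
```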

> My app will never need to be fast

I maintain a governmental service that went to production in less than 3 months, followed by 21 months of continuous development, serving 60 million citizens and a few thousand administrators, as the sole techie, on a single server, now in its third year. Needless to say, my country has never seen such a fast and useful project. I have not optimized anything. Of course, you can imagine this is not my first project of this kind. For me, "Python's speed is most often not a problem" is not a lie; I've proved it.

The project does have a slightly complex database, and the administration interface does implement really tight permission granularity (each department has its own admin team, with users in different roles). It did have to iterate quickly, but you know the story with Django: changing the DB schema is easy, migrations are generated by Django, you can write data migrations easily, tests tell you what you broke, you write new tests (I also use snapshot testing, so a lot of my tests actually write themselves), and upgrading a package is just as easy as fixing whatever broke when running the tests.
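As a rough illustration of the snapshot-testing idea (a hypothetical helper, not any particular library): on the first run the expected value is recorded to disk, so the assertion "writes itself"; later runs compare against the stored snapshot and fail only when the output changes.

```python
import json
from pathlib import Path

def check_snapshot(name, value, snapshot_dir="snapshots"):
    """Compare `value` against a stored snapshot, recording it on first run."""
    path = Path(snapshot_dir) / f"{name}.json"
    serialized = json.dumps(value, indent=2, sort_keys=True)
    if not path.exists():
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(serialized)  # first run: the test "writes itself"
        return True
    return path.read_text() == serialized

# First call records the snapshot; the second call verifies against it.
assert check_snapshot("user_roles", {"admin": ["read", "write"]})
assert check_snapshot("user_roles", {"admin": ["read", "write"]})
```

Reviewing the recorded snapshot files in code review then replaces hand-writing most expected values.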

You seem to think that Python is outdated because it's old, and that's also what I thought when I went over all the alternatives for my next 10 years of app development. I was really ready to trash all my Python. But that's how I figured out that the human-computer problem Python solves will always be relevant. I'll assume that you understand the point I made on that and that we simply disagree here.

Or maybe we don't really disagree: I'll agree with you that a compiled language is better for mission-critical components, but any of these will almost always need a CRUD, and that's where Python shines.

But I've not always been making CRUDs with Python: I have 2 years of experience as an OpenStack developer, and I must admit that Python fit the bill pretty well there too. Maybe my cloud company was not big enough to have problems, or we just avoided the common mistakes. I know people like Rackspace had a hard time maintaining forks of the services; I was the sole maintainer of 4 network-service rewrites, which were basically 1 package using OpenStack as a framework (like I would use Django), simply listening on RabbitMQ and doing stuff over SDN and SSH. Then again, I think not that many people actually practice CI/CD correctly, so that's definitely going to be a problem for them at some point.

> there's not currently a solid Go-alternative for django

That's one of the things that put me off: I tried all the Go web frameworks, and they are pretty cool, but will they ever reach the productivity levels of Django, Rails, or Symfony?

Meanwhile, I'm just waiting for the day someone puts me in charge of something sufficiently performance-critical that I need to rewrite it in a compiled language; if I got the chance to do some ASM optimizations along the way, that would also be a lot of fun. Another option is that I find something to contribute to a Go project, but so far, Go developers seem to be doing really fine without me :)

Why do I choose it for general-purpose development? I guess I'm stuck with "I love OOP" just as much as "the little functional programming Python offers".

I really enjoyed this conversation too; I would like to share it on my blog if you don't mind. Thank you for your time, and have a great weekend.



