I don't want to bug the OpalStack staff while they're so busy with the onslaught of WebFaction customers this week, so I thought I'd see what advice I can get from the community at large…

I have a Django application with uWSGI and PostgreSQL on my shared-hosting-plan account. Since my projects are mostly just for fun, up until now I've done all my development "live" on the server. But I'd like to establish a separate development environment, preferably local, where I can work on the code and then release it to production when it is ready—if for no other reason than just to establish the habit.

My thought was to create a version-control repository from my existing production code on the server, which I could then check out to my machine, edit locally, and push back to production when appropriate.

My Django code, templates, static files, etc. seem straightforward enough to add to a repo, but how do I handle the database? One of the reasons I chose Django and OpalStack was because they almost completely abstract away the DB layer, so it's all pretty opaque to me. I'm not even sure I know enough to know what it is that I am asking.

Couple of suggestions from me:

  1. Use GitHub or similar (whatever rings your bell; if you're more of a Mercurial person that's more than OK, Bitbucket then). The point is to have a separate hosted place for your private repository. That way the first step in your journey would be creating a repo, committing your current code into it, pushing it to GitHub, and then pulling it from GitHub into your "production".
  2. How familiar are you with Django? If you follow Django's pattern for managing databases, you never touch the database directly: you change your models in code and Django automagically generates migration code for that. So you don't need to sync databases per se; you just run the migrations on the prod server and it handles the schema changes for you.
  3. Fixtures, or data needed in production, are a bit more involved, but you can handle those in migrations as well (a rough sketch follows this list). See: https://docs.djangoproject.com/en/3.1/topics/migrations/
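
To make points 2 and 3 a bit more concrete, here's a rough sketch of what a hand-written data migration that seeds some default rows could look like. The app name `myapp` and the `Category` model are placeholders; the general shape is what the Django docs linked above describe.

```python
# myapp/migrations/0002_seed_categories.py -- a hand-written data migration.
# Schema migrations are generated with `python manage.py makemigrations`
# and applied (locally or on the prod server) with `python manage.py migrate`.
# "myapp" and the Category model are placeholders for your own app.
from django.db import migrations


def seed_categories(apps, schema_editor):
    # Always fetch the historical model via apps.get_model(), not a direct import.
    Category = apps.get_model("myapp", "Category")
    for name in ("News", "Projects", "Misc"):
        Category.objects.get_or_create(name=name)


def unseed_categories(apps, schema_editor):
    Category = apps.get_model("myapp", "Category")
    Category.objects.filter(name__in=["News", "Projects", "Misc"]).delete()


class Migration(migrations.Migration):

    dependencies = [
        ("myapp", "0001_initial"),
    ]

    operations = [
        migrations.RunPython(seed_categories, unseed_categories),
    ]
```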

Once you get that manual pipeline working (where you type these commands by hand), you can automate it. Familiarize yourself with some CI/CD tools, for example GitHub Actions.

    I'll add to Fjordis' excellent suggestions.

    For version control, I recommend GitHub, because their GitHub Desktop app may make your transition to using git a little easier.

    After that, you'll need to set up your local environment.

    • You can install Python and Postgres locally. If you're on a Mac, Homebrew makes this pretty easy. You'll probably want to use a Python virtualenv if you take this route, as a way to keep your app's Python modules isolated.
    • Or use Docker. It trades the trouble of installing and maintaining a local Postgres and Python environment for the trouble of dealing with Docker 🙂. Here's a nice guide for doing this from Docker themselves: https://docs.docker.com/compose/django/

    I've done both. I started with direct installs, and moved to Docker. They each have their downsides.

    To adapt your app so that it can run either locally or on Opalstack, you'll need to abstract some of your configuration. There are a few approaches; you can start with this StackOverflow thread.
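
    Just as a hedged illustration of what that abstraction can look like (the environment variable names here are made up, and this is only one of the approaches you'll see discussed), you can read anything environment-specific from environment variables with local defaults:

    ```python
    # settings.py (excerpt) -- read anything that differs between local and
    # Opalstack from environment variables, falling back to local defaults.
    # Variable names are examples only.
    import os

    DEBUG = os.environ.get("DJANGO_DEBUG", "1") == "1"

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": os.environ.get("DB_NAME", "myapp_dev"),
            "USER": os.environ.get("DB_USER", "myapp"),
            "PASSWORD": os.environ.get("DB_PASSWORD", ""),
            "HOST": os.environ.get("DB_HOST", "localhost"),
            "PORT": os.environ.get("DB_PORT", "5432"),
        }
    }
    ```

    On the server you'd set the production values in your app's environment (e.g. in your uWSGI start script or a file that isn't committed) and keep them out of the repo.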

    Expect a learning curve. It's a pretty big leap going from editing files on the server to using version control and a local environment. But it's a great skill, and opens the door to collaborating with other developers, which can be really fun.

      You didn't say what sort of computer you're running locally, but the easiest way to run Postgres on a Mac: https://postgresapp.com.

      Docker, which is not a small lift, is probably the way to get closest to life on Opalstack. Again, on a Mac, you can get bit by things like case-sensitivity in filenames even if you're using a virtual env.

      Fjordis Thanks for the suggestions. Fixtures = DB rows, essentially? This was the thing I was wondering most about. From the link it sounds like there is a way to put all the information needed to reconstruct the DB contents into a migrations file, rather than including the DB itself in the repo? I'll follow more links to understand the details when I get to that point; for now I just want to be clear on the overall picture of what I'd be accomplishing there.

      Fjordis, chad Re: version control, what is the benefit of having the repo stored in a cloud service like github, as opposed to using just git alone? Is that just future-proofing against the case where I eventually want to collaborate, or is there some benefit even if I am the only one editing the code?

        thecommexokid

        RE: version control. If the goal is just to have version history, using a cloud service is not a big deal; you'd get some little things like tools to visualize your history. However, if the goal is to learn about version control, it's more important: there are aspects of version control that only come up when you use it with a remote repo.

        RE: fixtures. A typical use case for fixtures is to provide the default data for a fresh install of your app. Say you were making a CMS. You wouldn't want your Articles table in your fixtures, but you might want your Categories table.

        I wouldn't recommend using fixtures as a way to synchronize the complete database; it'd be cumbersome. For that, dumping and loading an SQL file using the Postgres command line tools is probably the simplest way.
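
        For example (very rough sketch; host, user and database names are placeholders, and depending on your setup you may need to run the dump on the server and copy the file down rather than connecting remotely), the whole round trip is just pg_dump plus psql:

        ```python
        # sync_db.py -- sketch of the dump-and-load approach, shelling out to the
        # Postgres command line tools. All names below are placeholders; passwords
        # come from ~/.pgpass or a PGPASSWORD environment variable, and the local
        # database is assumed to be freshly created and empty.
        import subprocess

        PROD = {"host": "myserver.opalstack.com", "user": "myapp", "db": "myapp_prod"}
        LOCAL = {"host": "localhost", "user": "myapp", "db": "myapp_dev"}
        DUMP_FILE = "prod_dump.sql"

        # pg_dump writes a plain SQL file; --no-owner makes it easier to restore
        # under a different local user.
        subprocess.run(
            ["pg_dump", "--no-owner", "-h", PROD["host"], "-U", PROD["user"],
             "-f", DUMP_FILE, PROD["db"]],
            check=True,
        )

        # psql replays the SQL file into the local database.
        subprocess.run(
            ["psql", "-h", LOCAL["host"], "-U", LOCAL["user"],
             "-f", DUMP_FILE, LOCAL["db"]],
            check=True,
        )
        ```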

        Lots of good advice here already.

        I also run local dev environments; depending on the project I use Vagrant (https://www.vagrantup.com/), but I'm increasingly moving to Docker. I would strongly advise against installing Python and/or Postgres directly on your machine - you'll run into local version issues at some point (especially if you have more than one project or are doing version upgrades) and it's messy to unpick that stuff. Use Docker or Vagrant instead and install Python in a virtualenv. Keep good notes on how you set things up, and ideally script it so you can destroy/rebuild easily.

        I keep my code in Bitbucket and use pipelines (https://support.atlassian.com/bitbucket-cloud/docs/get-started-with-bitbucket-pipelines/) to automatically run tests on commits. I also run tests locally before a commit. I deploy the code to production by pulling from the git repo after all the tests have passed so I shouldn't have to update code manually. Using a cloud service gives lots of benefits. There are minor things like backups but you can also search your code easily and get easy access to graphical diffs and so on. Also, it's one less thing to worry about.

        I agree with what @chad said about fixtures. These are best used for setting up an initial state for the application and, most importantly, to support your automated tests; they're not a good way to keep in sync with production. As for migrations, you only want schema migrations when the database schema changes, and data migrations when you need to change data. For example, say you decide to add a database field to store people's initials: you'd then use a data migration to populate the new field from existing data (i.e. by taking the first letter of the first_name and last_name fields). Migrations are not a good way of syncing data between systems, though.
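
        To make that initials example concrete, a data migration for it might look roughly like this (app, model and migration names are hypothetical):

        ```python
        # people/migrations/0003_populate_initials.py -- hypothetical back-fill
        # for the initials example: migration 0002 added the field, this one
        # populates it from existing data.
        from django.db import migrations


        def populate_initials(apps, schema_editor):
            Person = apps.get_model("people", "Person")
            for person in Person.objects.all():
                person.initials = (person.first_name[:1] + person.last_name[:1]).upper()
                person.save(update_fields=["initials"])


        class Migration(migrations.Migration):

            dependencies = [
                ("people", "0002_person_initials"),
            ]

            operations = [
                # No reverse step needed: rolling back the schema migration drops the field.
                migrations.RunPython(populate_initials, migrations.RunPython.noop),
            ]
        ```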

        I try to avoid having production data in a dev/test environment, as that's undesirable for lots of reasons. If I do have to do something with prod data, I dump the data as SQL commands from the prod database, import it into a dev environment and then run some Django management commands that I wrote to anonymise it locally (before binning the dumped prod file). These commands need to be specific to the application, since they depend directly on the nature of the data you store, but it's good practice and means you can play about with a "live" dataset without being concerned about exposing the real data.
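
        In case it helps, a bare-bones sketch of such an anonymisation command could look like this (the model and field names are made up; yours will depend entirely on the data you store):

        ```python
        # myapp/management/commands/anonymise_data.py -- bare-bones sketch of a
        # management command that scrubs personal data after importing a prod dump.
        # The Customer model and its fields are placeholders.
        from django.core.management.base import BaseCommand

        from myapp.models import Customer


        class Command(BaseCommand):
            help = "Replace personal data with fake values in a dev database."

            def handle(self, *args, **options):
                for i, customer in enumerate(Customer.objects.all(), start=1):
                    customer.name = f"Customer {i}"
                    customer.email = f"customer{i}@example.com"
                    customer.phone = ""
                    customer.save(update_fields=["name", "email", "phone"])
                self.stdout.write(self.style.SUCCESS("Anonymised all customer records."))
        ```

        You'd run it with `python manage.py anonymise_data` straight after the import, and obviously never against production.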

        Also, it's worth saying that you could just create a new app and a new database on Opalstack and set that up as a dev/demo instance. This reduces the risk of you causing issues with the production instance. In my case I tend to have local dev instances, then a demo instance and a prod instance. The demo instance runs newer versions of the code in a production-like environment (e.g. on Opalstack), so it's as close to live as possible. By having a demo instance you can always get some users to test things out if need be.

        Hope that helps.

        5 months later

        thecommexokid I apologise if I'm stating the obvious here, but I would say the big benefit of pushing to GitHub is that it backs up your project's entire history. If you have cloud back-up for your whole system, including all your .git directories, you get that as well, but a lot of people don't, so I thought it worth mentioning.

        (Personally, my whole home directory contains symlinks to a buncho' git repos, a separate cloud storage directory, and a 'nocloud' directory which I'm grossly overdue for backing up to an external drive. I've been meaning to change the script that notifies me if I haven't pushed any of my git repos for a while, to also nag me about that. 😉)

        BTW, I'm glad you started this thread, because, six months later, I'm finding it very helpful. And I was one of the WebFaction migrants, so I also appreciate you keeping Opalstack's excellent support people free for me during those anxiety-prone months.

        a month later

        Interesting stuff and good tips. It's nice to see how others go about their work and push it to Opalstack.

        I would recommend Docker (https://www.docker.com/) as well, I find it vital for local development. It has a bit of a steep learning curve, but once you get the hang of it, it's easy to use and well worth the trouble.

        Using Git (https://git-scm.com/) is also definitely recommended, if only for the code history. It feels safe to work in a branch, and if things go wrong, you can always go back to a working version of your project. Definitely a must-have!

        I personally don't use GitHub for deployments of my websites. In fact, to answer the OP, I've built a tool for deploying a local dev environment, which may be useful to others here. See here for more info: https://community.opalstack.com/d/696
