I had a long-running process (a data import) and encountered a 504 Gateway Timeout error. I use an Nginx / uWSGI pair to serve my Django app.

After doing some research it turned out that the problem was the uwsgi_read_timeout directive. The directive sets how long Nginx waits for the uWSGI process to send response data; by default the value is 60 seconds. Set it to a bigger value if you have a long-running process.
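For example, a minimal sketch of the relevant Nginx location block (the location path, the socket path and the 300-second value are assumptions for illustration):

location / {
    include uwsgi_params;
    uwsgi_pass unix:/tmp/myapp.sock;
    # Wait up to 5 minutes for the uWSGI process to respond
    uwsgi_read_timeout 300;
}

Reload Nginx afterwards for the new timeout to take effect.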

I've just finished watching "AngularJS plus Django: A match made in heaven", a DjangoCon US 2014 talk by Nina Zakharenko (Video | Slides), and it was quite helpful. Since I work with APIs and mobile / Angular apps a lot, I made some Django + AngularJS notes based on this talk. Maybe they'll be useful for me and, perhaps, for you too :)

One of my Django apps uses Django Comments. To show a list of comments on a page I use this simple built-in template tag:

{% render_comment_list for object %}
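Note that the comments tag library has to be loaded in the template first; a minimal sketch (object is whatever context variable your view passes in):

{% load comments %}
{% render_comment_list for object %}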

I've just stumbled upon a very nice style guide for AngularJS on GitHub: https://github.com/johnpapa/angularjs-styleguide by John Papa.

In the guide you'll find recommended approaches to organising your AngularJS code so it stays consistent through good practices. It's very useful material for anyone who already has the AngularJS basics down.

Been working a lot with AngularJS and Django recently. To make them play nicely together, one of the first things you need to do in your Angular app is enable CSRF support and send the X-Requested-With header. Do it like this:
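A minimal sketch for an Angular 1.x module config (the myApp module name is an assumption; the cookie and header names are Django's defaults):

// Configure $http so Angular cooperates with Django's CSRF protection
angular.module('myApp', [])
  .config(['$httpProvider', function ($httpProvider) {
    // Read Django's CSRF cookie and send it back as the X-CSRFToken header
    $httpProvider.defaults.xsrfCookieName = 'csrftoken';
    $httpProvider.defaults.xsrfHeaderName = 'X-CSRFToken';
    // Mark AJAX requests so Django's request.is_ajax() recognises them
    $httpProvider.defaults.headers.common['X-Requested-With'] = 'XMLHttpRequest';
  }]);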

pip has a habit of re-downloading the same packages every time you run pip install package_name. That's not so cool, especially when you deploy packages from requirements.txt and one of them fails to build: restarting pip install -r requirements.txt makes pip re-download all the packages again.

Thankfully, there's an easy fix: create a configuration file named ~/.pip/pip.conf and add the following contents:
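A sketch of the pip.conf contents, assuming the older pip releases this applies to, where the download-cache option enables a local package cache (the cache path is an assumption):

[global]
download-cache = ~/.pip/cache

Recent pip versions (6.0+) cache downloads automatically, so this option is no longer needed there.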

In Django, if your model has a FileField / ImageField, the attached files are not deleted by default when the model instance is deleted. To fix that you need to do two steps:

1) Add a filecleanup function that'll take care of deleting files attached to a model:
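A minimal sketch of such a function, assuming it's wired to the post_delete signal (the exact implementation may differ; MyModel below is a hypothetical name):

from django.db.models import FileField

def filecleanup(sender, instance, **kwargs):
    # Walk the instance's fields and delete the file behind every FileField / ImageField
    for field in instance._meta.fields:
        if isinstance(field, FileField):
            file_obj = getattr(instance, field.name)
            if file_obj:
                # save=False keeps Django from trying to re-save the deleted row
                file_obj.delete(save=False)

# The second step is to connect the handler to your model, e.g.:
#   from django.db.models.signals import post_delete
#   post_delete.connect(filecleanup, sender=MyModel)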

I had a case recently where I needed to add custom data to the node display and wanted this data to behave like a field, even though the data itself didn't belong to a field. By "behaving like a field" I mean you can see that field in the node's display settings and control its visibility, label and weight by dragging and dropping it.

So, as you may have understood, the hook_preprocess_node / node_view_alter approach alone wasn't enough.
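One common way to get this behaviour in Drupal 7 is to declare a pseudo-field with hook_field_extra_fields() and fill it in hook_node_view(); a sketch, assuming a hypothetical custom module called mymodule and an article content type:

<?php
/**
 * Implements hook_field_extra_fields().
 * Registers a pseudo-field so it shows up on the "Manage display" screen.
 */
function mymodule_field_extra_fields() {
  $extra = array();
  $extra['node']['article']['display']['my_custom_data'] = array(
    'label' => t('My custom data'),
    'description' => t('Custom data rendered like a field.'),
    'weight' => 10,
  );
  return $extra;
}

/**
 * Implements hook_node_view().
 * Puts the actual content into the pseudo-field.
 */
function mymodule_node_view($node, $view_mode, $langcode) {
  if ($node->type == 'article') {
    $node->content['my_custom_data'] = array(
      '#markup' => '<p>' . t('Hello from my custom data') . '</p>',
    );
  }
}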

Very often we have multiple Django apps, each running Celery, all installed on a single server. How do we isolate each app's Celery workers? The answer is simple if you use RabbitMQ as your broker.

RabbitMQ lets us add multiple virtual hosts, so we can easily separate Celery queues and use a different broker URL per app. Let's start:
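A sketch of the RabbitMQ side, assuming hypothetical names app1_user, app1_password and app1_vhost:

# Create a dedicated user and virtual host for the app
rabbitmqctl add_user app1_user app1_password
rabbitmqctl add_vhost app1_vhost
# Give the user full rights on its own vhost only
rabbitmqctl set_permissions -p app1_vhost app1_user ".*" ".*" ".*"

Each app then points its Celery at its own vhost, e.g. BROKER_URL = 'amqp://app1_user:app1_password@localhost:5672/app1_vhost' in the Django settings (Celery 3.x-style setting name).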

Today, while moving a Django/PostgreSQL site between servers and importing an SQL dump into the new location, I got a surprising error: "ERROR: must be owner of extension plpgsql". I'm already used to PostgreSQL quirks, so after a quick search I came to the following solution:

Log in as the postgres user and do:
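A sketch of one common fix, assuming the role doing the import is called myuser (a hypothetical name): temporarily grant it superuser rights, re-run the import, then revoke them again.

-- Run in psql as the postgres superuser
ALTER USER myuser WITH SUPERUSER;    -- myuser is a hypothetical role name
-- ... re-run the import as myuser ...
ALTER USER myuser WITH NOSUPERUSER;  -- drop the elevated rights afterwards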
