At last, success!
Yesterday was the day. It all paid off.
I wouldn't have been able to do it so fast, especially the debugging, without AI.
But it is working! I have the first version of my scraper live (in my development environment) in Django!

20 jobs scraped and saved to the DB for further processing.
The scraping part was not difficult at all. I could reuse a lot of the CSS selectors from the Scrapy project. What was a bit more difficult was getting Celery to work while dealing with the new conventions of this boilerplate and Docker all at the same time.
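To give an idea of the shape of it, here is a minimal sketch of a Celery task that scrapes a listings page with CSS selectors and saves the results to the Django DB. The app name, `Job` model, URL, selectors, and field names are placeholders, not the real ones from my project.

```python
# jobs/tasks.py -- minimal sketch, assuming a `jobs` app with a `Job` model
# and a Celery app already wired into the Django project.
import requests
from celery import shared_task
from parsel import Selector

from .models import Job

LISTINGS_URL = "https://example-job-board.com/jobs"  # placeholder URL


@shared_task
def scrape_first_page():
    """Fetch the first page of listings and save each job to the DB."""
    response = requests.get(LISTINGS_URL, timeout=30)
    response.raise_for_status()
    selector = Selector(text=response.text)

    for card in selector.css("div.job-card"):  # hypothetical selector
        Job.objects.update_or_create(
            url=card.css("a::attr(href)").get(),
            defaults={
                "title": card.css("h2::text").get(default="").strip(),
                "company": card.css(".company::text").get(default="").strip(),
            },
        )
```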
Things to do:
- Get the scraper to work with pagination (for now, it only scrapes the 20 jobs that appear on the first page).
- After pagination works, make the scraper run on a schedule with Celery (see the sketch after this list).
- Re-use the code from the Language Detector from the last project, connect it to this new project, and make it work as a Celery worker.
- Re-use the code from the Job Categorizer from the last project, incorporate it into this project, and make it run as a Celery worker as well.
- Port as much of the frontend as possible.
- Buy a server from Hetzner and bring the project live, starting with a password-protected staging environment.
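For the scheduling item, this is roughly what I have in mind: a Celery beat entry pointing at the task from the sketch above. The task path, schedule name, and interval are assumptions I still need to settle on.

```python
# config/celery.py -- sketch of a beat schedule, assuming the hypothetical
# `jobs.tasks.scrape_first_page` task from the earlier sketch.
from celery import Celery
from celery.schedules import crontab

app = Celery("config")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

app.conf.beat_schedule = {
    "scrape-jobs-nightly": {
        "task": "jobs.tasks.scrape_first_page",
        "schedule": crontab(hour=3, minute=0),  # placeholder: every night at 03:00
    },
}
```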