django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet


I am trying to use pyspark to preprocess data for a prediction model. I get an error when I call spark.createDataFrame on the output of my preprocessing. Is there a way to check what processedRDD looks like before turning it into a DataFrame?

    import findspark
    findspark.init('/usr/local/spark')
    import pyspark
    from pyspark.sql import SQLContext
    import os
    import pandas as pd
    import geohash2

    sc = pyspark.SparkContext('local', 'sentinel')
    spark = pyspark.SQLContext(sc)
    sql = SQLContext(sc)
    working_dir = os.getcwd()

    # `data` is loaded earlier in the script
    df = sql.createDataFrame(data)

    df = df.select(['starttime', 'latstart', 'lonstart', 'latfinish', 'lonfinish', 'trip_type'])
    df.show(10, False)

    # data_cleaner, g, b and minutes_per_bin are defined elsewhere in the project
    processedRDD = df.rdd
    processedRDD = processedRDD \
                    .map(lambda row: (row, g, b, minutes_per_bin)) \
                    .map(data_cleaner) \
                    .filter(lambda row: row is not None)
    print(processedRDD)
    featuredDf = spark.createDataFrame(processedRDD, ['year', 'month', 'day', 'time_cat', 'time_num', 'time_cos',
                                                      'time_sin', 'day_cat', 'day_num', 'day_cos', 'day_sin', 'weekend',
                                                      'x_start', 'y_start', 'z_start', 'location_start', 'location_end', 'trip_type'])
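Note that `print(processedRDD)` only prints the RDD object, not its contents; `processedRDD.take(5)` (or `.first()`) materializes a few rows so you can actually look at them. Because the transformations are plain Python functions, you can also dry-run the same map/filter chain on an ordinary list before involving Spark at all. A minimal sketch of that idea, using a hypothetical `fake_cleaner` as a stand-in for the real `data_cleaner`:

```python
# Dry-run the map/filter pipeline on plain Python data, no Spark needed.
# fake_cleaner is a hypothetical stand-in for the real data_cleaner.
def fake_cleaner(args):
    row, g, b, minutes_per_bin = args
    if row.get('trip_type') is None:   # drop incomplete rows, as the filter expects
        return None
    return (row['starttime'], row['trip_type'], minutes_per_bin)

rows = [
    {'starttime': '08:15', 'trip_type': 'commute'},
    {'starttime': '09:30', 'trip_type': None},      # should be filtered out
]
g, b, minutes_per_bin = None, None, 15              # placeholder parameters

# Same shape as the RDD chain: map to a tuple, apply the cleaner, filter Nones.
processed = [fake_cleaner((row, g, b, minutes_per_bin)) for row in rows]
processed = [row for row in processed if row is not None]
print(processed)  # -> [('08:15', 'commute', 15)]
```

If the local dry run produces the rows you expect, the error is in the Spark environment rather than in your cleaning logic.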

I am getting this error:

    [Stage 1:>                                                          (0 + 1) / 1]2019-10-24 15:37:56 ERROR Executor:91 - Exception in task 0.0 in stage 1.0 (TID 1)

        raise AppRegistryNotReady("Apps aren't loaded yet.")
    django.core.exceptions.AppRegistryNotReady: Apps aren't loaded yet.

        at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.handlePythonException(PythonRunner.scala:452)
        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:588)
        at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRunner.scala:571)
        at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:406)
        at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at org.apache.spark.InterruptibleIterator.foreach(InterruptibleIterator.scala:28)
        at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
        at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
        at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
        at org.apache.spark.InterruptibleIterator.to(InterruptibleIterator.scala:28)
        at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
        at org.apache.spark.InterruptibleIterator.toBuffer(InterruptibleIterator.scala:28)
        at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
        at org.apache.spark.InterruptibleIterator.toArray(InterruptibleIterator.scala:28)
        at org.apache.spark.api.python.PythonRDD$$anonfun$3.apply(PythonRDD.scala:153)
        at org.apache.spark.api.python.PythonRDD$$anonfun$3.apply(PythonRDD.scala:153)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
        at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
        at org.apache.spark.scheduler.Task.run(Task.scala:121)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        ... 1 more

I do not understand what this has to do with importing a Django app.

Basically, you need to load your settings and populate Django’s application registry before doing anything else. You have all the information required in the Django docs.

https://docs.djangoproject.com/en/2.2/topics/settings/#calling-django-setup-is-required-for-standalone-django-usage


I don't know what this script has to do with Django exactly, but adding the following lines at the top of the script will probably fix this issue:

    import os
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
    import django
    django.setup()
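One detail worth knowing about this snippet: `os.environ.setdefault` only sets the variable if it is not already set, so a `DJANGO_SETTINGS_MODULE` exported in your shell will win over the value in the script (`myproject.settings` above is just a placeholder name). A quick stdlib-only illustration:

```python
import os

# setdefault leaves an existing value untouched...
os.environ['DJANGO_SETTINGS_MODULE'] = 'already.configured'
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])  # -> already.configured

# ...and only fills in the default when the key is missing.
del os.environ['DJANGO_SETTINGS_MODULE']
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
print(os.environ['DJANGO_SETTINGS_MODULE'])  # -> myproject.settings
```

Note also that the traceback above comes from a Spark executor, so if `data_cleaner` touches Django models, this setup likely needs to happen in the worker process as well (for example at the top of the module that defines `data_cleaner`), not only in the driver.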


Instead of manually running Hadoop, I am building a Python server that uses pyspark on a Django server, so heavy AI algorithms run about 10 times faster. The problem came from SPARK_LOCAL_IP: a different IP was being used (the one I use to connect to a remote database via sshtunnel). I import and use pyspark there. I had to rename a file and add the correct IP:

    cd /usr/local/spark/conf
    touch spark-env.sh.template
    mv -i spark-env.sh.template spark-env.sh
    nano spark-env.sh
    # paste: SPARK_LOCAL_IP="127.0.1.1"

Then I had to add sc.setLogLevel("ERROR") to my views.py to see what the real problem was; debugging Java errors from Python can be tricky. A column was a datetime instead of a string, and I fixed it.
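That column-type fix can be done in plain Python before the rows ever reach createDataFrame, by formatting the datetime values as strings. A minimal sketch (the field names here are illustrative, not from the real dataset):

```python
from datetime import datetime

rows = [
    {'starttime': datetime(2019, 10, 24, 15, 37), 'trip_type': 'commute'},
]

# Format datetime values as strings so the DataFrame schema sees str, not datetime.
cleaned = [
    {**row, 'starttime': row['starttime'].strftime('%Y-%m-%d %H:%M')}
    for row in rows
]
print(cleaned[0]['starttime'])  # -> 2019-10-24 15:37
```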


Comments
  • I don't even see where django is referred to in this script... What is findspark?
  • it is something like a handle to use Spark in my Django server, so calculations can be done 10 times faster and cheaper