Data

exception psynet.data.InvalidDefinitionError[source]

Bases: ValueError

Raised when an invalid definition is encountered.

class psynet.data.SQLMixin(*args, **kwargs)[source]

Bases: SQLMixinDallinger

We apply this mixin when creating our own SQL-backed classes from scratch. For example:

from psynet.data import SQLBase, SQLMixin, register_table

@register_table
class Bird(SQLBase, SQLMixin):
    __tablename__ = "bird"

class Sparrow(Bird):
    pass
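
Subclasses such as Sparrow then receive their polymorphic identities automatically from their class names, with no need to specify them manually (see SQLMixinDallinger below).
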
class psynet.data.SQLMixinDallinger(*args, **kwargs)[source]

Bases: SharedMixin

We apply this mixin class when subclassing Dallinger classes, for example Network and Info. It adds a few useful exporting features and, most importantly, automatic mapping logic, so that polymorphic identities are constructed automatically from class names instead of having to be specified manually. For example:

from dallinger.models import Info

class CustomInfo(Info):
    pass
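
Here CustomInfo automatically receives the polymorphic identity "CustomInfo", with no need for a manual __mapper_args__ declaration.
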
scrub_pii(json)[source]

Removes personally identifying information from the object’s JSON representation. This is a destructive operation (it changes the input object).
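
A hedged usage sketch, assuming that json is the dictionary representation produced by to_dict() (the participant variable is illustrative):

json_data = participant.to_dict()
participant.scrub_pii(json_data)  # destructive: json_data is modified in place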

to_dict()[source]

Determines the information that is shown for this object in the dashboard and in the CSV files generated by psynet export.
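
Custom classes can override this method to customize the export; a minimal sketch, building on the Bird example above (the wingspan_cm column and the derived field are hypothetical):

from sqlalchemy import Column, Float

from psynet.data import SQLBase, SQLMixin, register_table

@register_table
class Bird(SQLBase, SQLMixin):
    __tablename__ = "bird"

    wingspan_cm = Column(Float)  # hypothetical custom column

    def to_dict(self):
        data = super().to_dict()  # start from the standard representation
        data["wingspan_pretty"] = f"{self.wingspan_cm} cm"  # add a derived field
        return data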

psynet.data.drop_all_db_tables(bind=Engine(postgresql://dallinger@postgres:5432/dallinger))[source]

Drops all tables from the Postgres database. Includes a workaround for the fact that SQLAlchemy doesn’t provide a CASCADE option to drop_all, which was causing errors with Dallinger’s version of database resetting in init_db.

(https://github.com/pallets-eco/flask-sqlalchemy/issues/722)
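
A minimal usage sketch; with no arguments the function uses the default engine shown in the signature:

from psynet.data import drop_all_db_tables

drop_all_db_tables()  # drops every table, cascading over dependent objects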

psynet.data.dump_db_to_disk(dir, scrub_pii)[source]

Exports all database objects to JSON-style dictionaries and writes them to CSV files, one for each class type.

Parameters:
  • dir – Directory to which the CSV files should be exported.

  • scrub_pii (bool) – Whether to remove personally identifying information.
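
A hedged usage sketch (the directory name is illustrative):

from psynet.data import dump_db_to_disk

dump_db_to_disk("export", scrub_pii=True)  # writes one CSV per class type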

psynet.data.get_db_tables()[source]

Lists the tables in the database.

Returns:

  • A dictionary where the keys identify the tables and the values are the table objects themselves.
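
For example (the exact table names depend on the experiment):

from psynet.data import get_db_tables

tables = get_db_tables()
print(sorted(tables))  # standard tables such as "info" and "node", plus custom ones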

psynet.data.get_sql_base_class(x)[source]

Returns the SQLAlchemy base class of an object x, or None if no such base class is found.
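
A hedged sketch (my_info stands for an instance of an Info subclass):

from psynet.data import get_sql_base_class

get_sql_base_class(my_info)  # -> dallinger.models.Info
get_sql_base_class("hello")  # -> None: a plain string has no SQL base class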

psynet.data.ingest_to_model(file, model, engine=None, clear_columns=None, replace_columns=None)[source]

Imports a CSV file to the database. The implementation is similar to dallinger.data.ingest_to_model, but incorporates a few extra parameters (clear_columns, replace_columns) and does not fail for tables without an id column.

Parameters:
  • file – CSV file to import, specified as a file object (e.g. as created by open())

  • model – SQLAlchemy class corresponding to the objects that should be created.

  • clear_columns (Optional[List]) – Optional list of columns to clear when importing the CSV file. This is useful in the case of foreign-key constraints (e.g. participant IDs).

  • replace_columns (Optional[dict]) – Optional dictionary of values to set for particular columns.
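
A hedged usage sketch (the file name and column values are illustrative):

from dallinger.models import Info
from psynet.data import ingest_to_model

with open("info.csv") as f:
    ingest_to_model(
        f,
        Info,
        clear_columns=["participant_id"],   # avoid foreign-key violations
        replace_columns={"failed": False},  # overwrite a column wholesale
    )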

psynet.data.ingest_zip(path, engine=None)[source]

Given a path to a zip file created with export(), recreates the database with the data stored in the included CSV files. This is a patched version of dallinger.data.ingest_zip that adds support for custom tables.
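
For example (the path is illustrative):

from psynet.data import ingest_zip

ingest_zip("data/my-experiment-export.zip")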

psynet.data.register_table(cls)[source]

This decorator should be applied whenever defining a new SQLAlchemy table. For example:

@register_table
class Bird(SQLBase, SQLMixin):
    __tablename__ = "bird"
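
Registering the table makes it visible to PsyNet's database utilities, for example the custom-table support in ingest_zip().
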
psynet.data.sql_base_classes()[source]

Lists the base classes underpinning the different SQL tables used by PsyNet, including both base classes defined in Dallinger (e.g. Node, Info) and base classes for additional custom PsyNet tables.

Returns:

  • A dictionary of base classes (e.g. Node), keyed by the corresponding table names for those base classes (e.g. node).
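
For example (abbreviated):

from psynet.data import sql_base_classes

sql_base_classes()  # {"node": Node, "info": Info, ...}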

psynet.data.update_dashboard_models()[source]

Determines the list of objects in the dashboard database browser.