What data structure allows accessing the same record through multiple keys, even key-to-key?

Before constructing some new object to perform this task, perhaps someone can point me to an existing solution or tell me what search terms I should use to research further.

My problem is that I have a small collection of related data that comes up again and again in my problem domain.

For example, suppose my company has a single field office in each state's capital:

'Name': 'Alabama',  'Short': 'AL', 'Capital': 'Montgomery', 'Zip': 36043
'Name': 'Colorado', 'Short': 'CO', 'Capital': 'Denver', 'Zip': 80014
'Name': 'Florida', 'Short': 'FL', 'Capital': 'Tallahassee', 'Zip': 32301

Note: There are no duplicate values.

Each business unit wants to access the zip code differently.

Accounting wants to use the state's two letter abbreviation to access the zip code (Key-to-Data).

Finance wants to use the full state name to access the zip code (Key-to-Data).

Legal (always the difficult one) would also like to get the state name given the capital (Key-to-Key).

The number of offices is small -- usually fewer than 4 -- so I would like to avoid a "heavy" approach such as a database.

Ideally, I would like to simply define this data at the top of the script as a single define.

I could of course define multiple dicts, each keyed differently and pointing to the zip code, plus a few more dicts for the key-to-key mappings, but that seems repetitive and error-prone.
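For concreteness, here is a sketch of the repetitive version I'm trying to avoid (the dict names are mine, purely for illustration):

```python
# The naive approach: one hand-written dict per access pattern.
# Every new office must be added to each dict separately -- easy to get wrong.
SHORT_TO_ZIP = {'AL': 36043, 'CO': 80014, 'FL': 32301}                  # Accounting
NAME_TO_ZIP = {'Alabama': 36043, 'Colorado': 80014, 'Florida': 32301}   # Finance
CAPITAL_TO_NAME = {'Montgomery': 'Alabama', 'Denver': 'Colorado',
                   'Tallahassee': 'Florida'}                            # Legal

print(SHORT_TO_ZIP['AL'], NAME_TO_ZIP['Colorado'], CAPITAL_TO_NAME['Tallahassee'])
```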

What's an elegant way to implement and access this type of many-to-one but unique data?

Here's the solution I wound up using to access unique data with any of multiple key encodings. Keep in mind this is a contrived example to illustrate/simplify the real problem.

import collections

OfficeEntry = collections.namedtuple('OfficeEntry', ['name', 'short', 'capital', 'zip'])

class OfficeNameDecoder:

    def __init__(self, office_entries):
        self.data = office_entries

    def __getitem__(self, key):
        # A namedtuple's membership test scans every field's value,
        # so any unique field value works as the lookup key.
        for entry in self.data:
            if key in entry:
                return entry

        raise KeyError(key)

If we instantiate the OfficeNameDecoder with some sample data:

SAMPLE_OFFICE_ENTRIES = [ OfficeEntry('Alabama',  'AL', 'Montgomery', 36043),
                          OfficeEntry('Colorado', 'CO', 'Denver',     80014),
                          OfficeEntry('Florida',  'FL', 'Tallahassee', 32301) ]

ond = OfficeNameDecoder(SAMPLE_OFFICE_ENTRIES)

Accounting can get field office zip codes by doing:

>>> ond['AL'].zip
36043

Finance can get field office zip codes using:

>>> ond['Colorado'].zip
80014

and I can finally stop fielding phone calls from Legal for state abbreviations:

>>> ond['Tallahassee'].short
'FL'
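Because the membership test scans every field, any unique value can act as the key -- even the zip code itself. This relies on the no-duplicate-values guarantee above. A self-contained demonstration of the mechanism:

```python
import collections

OfficeEntry = collections.namedtuple('OfficeEntry', ['name', 'short', 'capital', 'zip'])

offices = [OfficeEntry('Alabama',  'AL', 'Montgomery', 36043),
           OfficeEntry('Colorado', 'CO', 'Denver',     80014)]

# 36043 appears among the tuple's values, so the zip itself can be the key:
entry = next(e for e in offices if 36043 in e)
print(entry.name)  # Alabama
```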


Pandas is quite powerful at this kind of thing. If you are open to using it, here's the code.

Define a data frame:

>>> import pandas as pd
>>> df = pd.DataFrame({'Name': ['Alabama', 'Colorado', 'Florida'], 'Short': ['AL', 'CO', 'FL'], 'Capital': ['Montgomery', 'Denver', 'Tallahassee'], 'Zip': [36043, 80014, 32301]})
>>> df
       Capital      Name Short    Zip
0   Montgomery   Alabama    AL  36043
1       Denver  Colorado    CO  80014
2  Tallahassee   Florida    FL  32301

Using state's 2-letter abbreviation to get the zip code:

>>> df[df['Short'] == 'AL']['Zip'].values[0]

Using state's name to get the zip code:

>>> df[df['Name'] == 'Colorado']['Zip'].values[0]

Using state's capital to get the state name:

>>> df[df['Capital'] == 'Tallahassee']['Name'].values[0]
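If these lookups happen often, setting the key column as the index avoids building a boolean mask on every query. A sketch using the same DataFrame as above (`by_short`/`by_capital` are my names):

```python
import pandas as pd

df = pd.DataFrame({'Name': ['Alabama', 'Colorado', 'Florida'],
                   'Short': ['AL', 'CO', 'FL'],
                   'Capital': ['Montgomery', 'Denver', 'Tallahassee'],
                   'Zip': [36043, 80014, 32301]})

# Index once by the two-letter abbreviation, then look up by label:
by_short = df.set_index('Short')
print(by_short.loc['AL', 'Zip'])          # 36043

# A separately indexed view per business unit:
by_capital = df.set_index('Capital')
print(by_capital.loc['Tallahassee', 'Name'])  # Florida
```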


Unfortunately, I don't think there's a non-database solution for what you are looking for. Two-way memory-efficient injective mappings aren't native to Python, or perhaps to any programming language.

For small applications where you can guarantee that the mappings between fields are injective, you can define multiple dictionaries. This need not be error-prone: you can use collections.defaultdict and itertools.permutations to build a dictionary of dictionaries automatically. I only advise this route if your data is small but queried many times.

from collections import defaultdict
from itertools import permutations

data = [{'Name': 'Alabama',  'Short': 'AL', 'Capital': 'Montgomery', 'Zip': 36043},
        {'Name': 'Colorado', 'Short': 'CO', 'Capital': 'Denver', 'Zip': 80014},
        {'Name': 'Florida', 'Short': 'FL', 'Capital': 'Tallahassee', 'Zip': 32301}]

d = defaultdict(dict)

for item in data:
    for key1, key2 in permutations(item.keys(), 2):
        d[(key1, key2)][item[key1]] = item[key2]

Inspecting d shows the generated pairings (excerpt; all 12 ordered field pairs are present):

>>> d
defaultdict(<class 'dict'>,
            {('Capital', 'Name'): {'Denver': 'Colorado', 'Montgomery': 'Alabama',
                                   'Tallahassee': 'Florida'},
             ('Capital', 'Short'): {'Denver': 'CO', 'Montgomery': 'AL',
                                    'Tallahassee': 'FL'},
             ...
             ('Zip', 'Name'): {32301: 'Florida', 36043: 'Alabama', 80014: 'Colorado'},
             ('Zip', 'Short'): {32301: 'FL', 36043: 'AL', 80014: 'CO'}})
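With the nested dict built, each department's lookup is a pair of indexing operations: the outer key names the (from-field, to-field) pair, the inner key is the value in hand. The full pipeline as a self-contained sketch:

```python
from collections import defaultdict
from itertools import permutations

data = [{'Name': 'Alabama',  'Short': 'AL', 'Capital': 'Montgomery', 'Zip': 36043},
        {'Name': 'Colorado', 'Short': 'CO', 'Capital': 'Denver', 'Zip': 80014},
        {'Name': 'Florida', 'Short': 'FL', 'Capital': 'Tallahassee', 'Zip': 32301}]

# One mapping per ordered pair of fields, built in a single pass.
d = defaultdict(dict)
for item in data:
    for key1, key2 in permutations(item.keys(), 2):
        d[(key1, key2)][item[key1]] = item[key2]

print(d[('Short', 'Zip')]['AL'])              # Accounting: 36043
print(d[('Name', 'Zip')]['Colorado'])         # Finance:    80014
print(d[('Capital', 'Name')]['Tallahassee'])  # Legal:      Florida
```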

  • I'd be wary of some of the assumptions here - for example, that you never need to deal with two offices in the same state, or two cities with the same name, or multi-state ZIP codes.
  • The data above is for example only. I used it because it is more approachable than the short strings and hex codes of the actual (embedded) data. Each data point is always unique and there are never more than 6 rows, usually only 3-4 rows.
  • It sounds like you should be using a real database, at least SQLite. It's okay for a rarely-changing SQL table to have many indexes.
  • 2-way memory-efficient injective mappings aren't native to Python (or any programming language I'm aware of), you would need to code this yourself in a lower-level language with a perfect hash. In my opinion, there should be an implementation for such a use case, at least when you have a small number of items. The counter-argument, of course, is "use a database".
  • Ouch, that algorithmic complexity.
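On that last point: because every value is guaranteed unique, the linear scan in __getitem__ can be traded for a one-time flattening pass that maps every field value to its whole entry, making each lookup an O(1) dict access. A sketch (ENTRIES/INDEX are my names):

```python
import collections

OfficeEntry = collections.namedtuple('OfficeEntry', ['name', 'short', 'capital', 'zip'])

ENTRIES = [OfficeEntry('Alabama',  'AL', 'Montgomery',  36043),
           OfficeEntry('Colorado', 'CO', 'Denver',      80014),
           OfficeEntry('Florida',  'FL', 'Tallahassee', 32301)]

# One-time O(rows * fields) pass; valid only because all values are unique.
INDEX = {value: entry for entry in ENTRIES for value in entry}

print(INDEX['AL'].zip)             # 36043
print(INDEX['Tallahassee'].short)  # FL
print(INDEX[36043].name)           # Alabama
```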