dbf-0.99.10/PKG-INFO
Metadata-Version: 2.1
Name: dbf
Version: 0.99.10
Summary: Pure python package for reading/writing dBase, FoxPro, and Visual FoxPro .dbf files (including memos)
Home-page: https://github.com/ethanfurman/dbf
Author: Ethan Furman
Author-email: ethan@stoneleaf.us
License: BSD License
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: BSD License
Classifier: Programming Language :: Python
Classifier: Topic :: Database
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Provides: dbf

Currently supports dBase III, Clipper, FoxPro, and Visual FoxPro tables. Text is returned as unicode, and codepage settings in tables are honored. Memos and Null fields are supported. Documentation needs work, but the author is very responsive to e-mails.

Not supported: index files (though temporary non-file indexes can be created), auto-incrementing fields, and Varchar fields.
Installation: `pip install dbf`

dbf-0.99.10/dbf/LICENSE
Copyright (c) 2008-2019 Ethan Furman
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

Neither the name Ethan Furman nor the names of any contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
dbf-0.99.10/dbf/README.md
dbf
===

dbf (also known as python dbase) is a module for reading/writing dBase III, FP, VFP, and Clipper .dbf database files. It's an ancient format that still finds lots of use (the most common I'm aware of is retrieving legacy data so it can be stored in a newer database system; other uses include GIS, stand-alone programs such as Family History, Personal Finance, etc.).

Highlights
----------

Table -- represents a single .dbf/.dbt (or .fpt) file combination and provides access to records; supports the sequence access and 'with' protocols. Temporary tables can also live entirely in memory.

Record -- represents a single record/row in the table, with field access returning native or custom data types; supports the sequence, mapping, attribute access (with the field names as the attributes), and 'with' protocols. Updates to a record object are reflected on disk either immediately (using gather() or write()), or at the end of a 'with' statement.

Index -- nonpersistent index for a table.

Fields::

    dBase III (Null not supported)
        Character --> unicode
        Date      --> datetime.date or None
        Logical   --> bool or None
        Memo      --> unicode or None
        Numeric   --> int/float depending on field definition or None
        Float     --> same as numeric

    Clipper (Null not supported)
        Character --> unicode (character fields can be up to 65,519)

    Foxpro (Null supported)
        General   --> str (treated as binary)
        Picture   --> str (treated as binary)

    Visual Foxpro (Null supported)
        Currency  --> decimal.Decimal
        douBle    --> float
        Integer   --> int
        dateTime  --> datetime.datetime

If a field is uninitialized (Date, Logical, Numeric, Memo, General, Picture) then None is returned for the value.
Custom data types::

    Null     --> used to support Null values
    Char     --> unicode type that auto-trims trailing whitespace, and ignores trailing whitespace for comparisons
    Date     --> date object that allows for no date
    DateTime --> datetime object that allows for no datetime
    Time     --> time object that allows for no time
    Logical  --> adds an Unknown state to bool: instead of True/False/None, values are Truth, Falsth, and Unknown, with appropriate tri-state logic; just as bool(None) is False, bool(Unknown) is also False; the numerical values of Falsth, Truth, and Unknown are 0, 1, 2
    Quantum  --> similar to Logical, but implements boolean algebra (I think). Has states of Off, On, and Other. Other has no boolean nor numerical value, and attempts to use it as such will raise an exception

Whirlwind Tour
--------------

    import datetime
    import dbf

    # create an in-memory table
    table = dbf.Table(
        filename='test',
        field_specs='name C(25); age N(3,0); birth D; qualified L',
        on_disk=False,
        )
    table.open(dbf.READ_WRITE)

    # add some records to it
    for datum in (
            ('Spanky', 7, dbf.Date.fromymd('20010315'), False),
            ('Spunky', 23, dbf.Date(1989, 7, 23), True),
            ('Sparky', 99, dbf.Date(), dbf.Unknown),
            ):
        table.append(datum)

    # iterate over the table, and print the records
    for record in table:
        print(record)
        print('--------')
        print(record[0:3])
        print([record.name, record.age, record.birth])
        print('--------')

    # make a copy of the test table (structure, not data)
    custom = table.new(
        filename='test_on_disk.dbf',
        default_data_types=dict(C=dbf.Char, D=dbf.Date, L=dbf.Logical),
        )

    # automatically opened and closed
    with custom:
        # copy records from test to custom
        for record in table:
            custom.append(record)
        # modify each record in custom (could have done this in prior step)
        for record in custom:
            dbf.write(record, name=record.name.upper())
            # and print the modified record
            print(record)
            print('--------')
            print(record[0:3])
            print([record.name, record.age, record.birth])
            print('--------')

    table.close()
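The tri-state behavior described above for Logical (Truth/Falsth/Unknown, where bool(Unknown) is False just like bool(None), and the numeric values are 1/0/2) can be sketched without the dbf package at all. The class below is an illustrative stand-in, not dbf's actual Logical implementation:

```python
# Minimal stand-in for the tri-state semantics described above.
# NOT dbf's implementation -- just an illustration of the documented behavior.
class TriState:
    def __init__(self, name, number, truthy):
        self.name, self.number, self.truthy = name, number, truthy

    def __bool__(self):
        # just as bool(None) is False, bool(Unknown) is False
        return self.truthy

    def __int__(self):
        return self.number

    def __repr__(self):
        return self.name


Falsth = TriState('Falsth', 0, False)
Truth = TriState('Truth', 1, True)
Unknown = TriState('Unknown', 2, False)

print(bool(Unknown))   # False
print(int(Unknown))    # 2
```

This is why the changelog below recommends `if some_var is Unknown:` checks: truthiness alone cannot distinguish Falsth from Unknown.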
dbf-0.99.10/dbf/WHATSNEW
What's New
==========

0.99.001
--------
Vapor objects are now Falsey
'@' TimeStamp column type added to db3 tables
fix encoding error when exporting csv files

0.99.000
--------
BACKWARDS INCOMPATIBLE CHANGE: new records will have unspecified fields that are nullable set to Null

0.98.004
--------
change default input encoding from 'ascii' to 'utf-8'
improve unicode support
default field case is now upper
FieldnameList is used to support both upper- and lower-case field checks
fix memo block size changing to default when copying

0.98.003
--------
fix repr of Record and RecordTemplate
fix > and >= for Date

0.98.002
--------
fix file extension handling

0.98.001
--------
allow dbf files without .dbf ending
allow any case for accessing field names

0.98.000
--------
fix resize_field to work on empty tables
fix export to fixed-width by using codecs.open
allow non-standard characters in field names (with warnings)

0.97.011
--------
allow '00000000' as a null date in dbf files

0.97.010
--------
fixed nullable-field access in vfp tables

0.97.009
--------
allow tzinfo addition/change in DateTime.combine()
fix strptime for Time
hook into datetime.datetime and .date equality comparisons

0.97.008
--------
Marshalling a DateTime now works

0.97.007
--------
add pytz support to Time and DateTime
Marshalling a DateTime sends as UTC if possible

0.97.004
--------
Fix bug in VFP doubles

0.97.003
--------
Fix field creation flag handling
Allow (redundant) "binary" flag on binary type fields

0.97.002
--------
Fix fp/FoxBase table header creation

0.97.001
--------
Can now open dbf files as read-only when lacking write permissions.

0.97.000
--------
Moved to a single 2/3 code base.
Opening a database now defaults to dbf.READ_ONLY; can also use dbf.READ_WRITE; the corresponding strings are no longer supported.

0.96.008
--------
Handle vfp_double with decimal (Foxpro uses it for display purposes) (thanks, Joshua Adelman!)

0.96.007
--------
rev: 168
rename tests to test

0.96.006
--------
restore pql

0.96.005
--------
fix create_backup to use the active codepage, which may be different from the table's codepage

0.96.004
--------
change duplicate NUMERIC to FLOAT
add Float fields to tests
change from_csv to write directly to disk if to_disk requested
fix erroneous access to _dirty in RecordTemplate

0.96.003
--------
add 'basestring' to ver_2.py

0.96.000
--------
add support for Python 3
convert from module to package layout
remove pql
fix Time and DateTime signatures: microsec --> microseconds
fix Time and DateTime .now() to truncate microseconds past thousands

0.95.014
--------
use a sparse container for the table -- should make very large dbf files usable

0.95.013
--------
Null fields properly ignored if table doesn't support it

0.95.012
--------
adjust setup.py to require enum34
add custom data types to xmlrpclib.Marshaller (marshalled as the underlying type)
add support for storing numbers as scientific notation
fixed DateTime.now() and Time.now() to only return milliseconds

0.95.008
--------
fix Period.__contains__
add new default_data_type to Table.__init__ of 'enhanced', which selects all the custom data types (Char, Logical, Date, DateTime)
add string input type to Date, Time, and DateTime

0.95.007
--------
Add .fromfloat() to Time
Add .tofloat() to Time
Add Period for matching various time periods

0.95.006
--------
Add .format() and .replace() to Date, Time, and DateTime
Add nesting to Table context manager
Add enumerations IsoDay, RelativeDay, IsoMonth, RelativeMonth

0.95.003
--------
Fixed issue with incorrect memo size in base file (thanks, Glenn!)
Memo file extensions now have the same casing as the dbf file's, and are searched for that way (thanks again, Glenn!)

0.95.002
--------
Fixed the version number in this file for the last release. :/
string slices now work for RecordTemplate

0.95.001
--------
Miscellaneous bugs squashed.
backup tables are created in the same location as the original table if none of TMP, TEMP, nor DBF_TEMP are defined in the environment
delete() and undelete() now support RecordTemplate
Process() and Templates() now support start, stop, and filter to allow finer control of which records will be returned.
Added Relation, which makes linking two tables together on a common field easier. Not persistent.
xBase-Compatibility Break: added utf8 codepage (xf0).
Backwards-Compatibility Break: reverted Logical.__eq__ to return True if the Logical is True, and False otherwise; this properly mimics the behavior of using True/False/None directly. If the previous behavior is desired, use Quantum instead (it uses the states On/Off/Other), or use `if some_var is Unknown: ... ; elif some_var ... ; else ... `.

Many thanks to all who have helped with ideas and bug fixes.

0.94.004
--------
Templates now use the same (custom) data types as the table they are created from.
Added Index.index_search(match, start=None, stop=None, nearest=False, partial=False), which returns the index of the first match. If nearest is False and nothing is found, a NotFoundError is raised; otherwise the index of where the match would be is returned.
Added IndexLocation, which is a subclass of long and is returned by Index.index_search. Unlike normal numbers, where 0 == False and anything else == True, an IndexLocation is True if the number represents a found match, and False if the number represents where a match should be (a False value will only be returned if nearest == True).
Backwards-Compatibility Break: memory-only tables are now specified with on_disk=False instead of bracketing the filename with ':'.
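The IndexLocation idea described above -- a number whose truthiness reports "found" rather than "nonzero" -- can be sketched in a few lines of plain Python. This is a simplified stand-alone illustration (it omits the `nearest`/`NotFoundError` handling and is not dbf's implementation):

```python
from bisect import bisect_left

class IndexLocation(int):
    """An int whose truthiness means 'found', not 'nonzero'."""
    def __new__(cls, value, found):
        self = int.__new__(cls, value)
        self.found = found
        return self

    def __bool__(self):
        return self.found


def index_search(values, match):
    # truthy IndexLocation on an exact hit; falsey one giving the
    # insertion point otherwise
    i = bisect_left(values, match)
    found = i < len(values) and values[i] == match
    return IndexLocation(i, found)


hit = index_search(['ant', 'bee', 'cat'], 'bee')
miss = index_search(['ant', 'bee', 'cat'], 'bat')
print(int(hit), bool(hit))     # 1 True
print(int(miss), bool(miss))   # 1 False  ('bat' would be inserted at 1)
```

Note that both results can be the same integer; only the `found` flag distinguishes a match from an insertion point, which is exactly why a plain int would not do.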
Removed dbf.codepage() and dbf.encoding() as users can directly access dbf.default_codepage and dbf.input_decoding.
Backwards-Compatibility Break: .use_deleted no longer used (it disappeared sometime between .90.0 and now). Rationale: the deleted flag is just that: a flag. The record is still present and still available. If you don't want to use it, either check if the record has been deleted (dbf.is_deleted(record)), or create an index that doesn't include the deleted records... or pack the table and actually remove the records for good.

0.94.003
--------
Minor bug fixes, more documentation.

0.94.001
--------
Added support for Clipper's large Character fields (up to 65,519)
More code clean-up and slight breakage::
  - _Dbf* has had '_Dbf' removed (_DbfRecord --> Record)
  - DbfTable --> Table (Table factory function removed)

0.94.000
--------
Massive backwards incompatible changes.
export() method removed from Tables and made into a normal function.
All non-underscore methods removed from the record class and made into normal functions::
  - delete_record --> delete
  - field_names --> field_names
  - gather_records --> gather
  - has_been_deleted --> is_deleted
  - record_number --> recno
  - reset_record --> reset
  - scatter_records --> scatter
  - undelete_record --> undelete
  - write_record --> write
Transaction methods removed entirely.
Can use strings as start/stop of slices: `record['name':'age']`
Record templates now exist, and are much like records except that they are not directly tied to a table and can be freely modified. They can be created by either the `dbf.create_template` function or the `table.create_template` method.
scatter() now returns a RecordTemplate instead of a dict, but the as_type parameter can be used to get dicts (or tuples, lists, whatever)

0.93.020
--------
Finished changes so other Python implementations should work (PyPy definitely does).
Table signature changed -- `read_only`, `meta_only`, and `keep_memos` dropped.
tables now have a `status` attribute which will be one of `closed`, `read_only`, or `read_write`
`.append` no longer returns the newly added record (use table[-1] if you need it)
`.find` method removed (use `.query` instead); `.sql` method removed (use `.query` instead); `.size` renamed to `.field_size`; `.type` renamed to `.field_type` (which returns a FieldType named tuple)
the way to change records has changed:
  to update any/all fields at once:
    record.write_record(field1=..., field2=...)
    or
    record.gather_fields(dict)
  to update one field at a time:  2.6, 2.7 (2.5 using `from __future__ import with_statement`)
    with record:
        record.field1 = ...
        record.field2 = ...
    or
    for record in dbf.Process(table | records):
        record.field1 = ...
        record.field2 = ...
  attempting to change a field outside of these two methods will raise a `DbfError`.
Changing behavior based on a transaction:
  record.gather_fields()
    if a transaction is not running, this will write to disk (no changes made if an error occurs, exception reraised)
    if a transaction is running and an error occurs, the calling code is responsible for calling .rollback_transaction() or otherwise handling the problem (exception is reraised)
  record.reset_record()
    if a transaction is not running, the changes are written to disk
    if a transaction is running, the changes are not written to disk
`xxx in table` and `xxx in record` used to be a field-name check -- it is now a record / value check; use `xxx in table.field_names` and `xxx in record.field_names` to do the field-name check.
added equality/inequality check for records, which can be compared against other records / dicts / tuples (field/key order does not matter for record-record nor record-dict checks).

0.93.011
--------
`with` will work now. Really.
Started making changes so dbf will work with the non-CPython implementations (at this point it is not reliable).

0.93.010
--------
Table now returns a closed database; .open() must now be called before accessing the records.
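The `with record:` update pattern described in the 0.93.020 entry boils down to buffering field assignments and flushing them when the block exits cleanly. A toy sketch of that mechanism (the class and names here are invented for illustration; this is not dbf's Record class):

```python
class ToyRecord:
    """Buffers field changes made inside a `with` block; flushes on clean exit."""
    def __init__(self, **fields):
        self.__dict__['_fields'] = dict(fields)
        self.__dict__['_pending'] = None

    def __enter__(self):
        self.__dict__['_pending'] = {}
        return self

    def __setattr__(self, name, value):
        pending = self.__dict__['_pending']
        if pending is None:
            # mirrors dbf raising DbfError for updates outside with/Process
            raise RuntimeError('record not open for update')
        pending[name] = value

    def __getattr__(self, name):
        try:
            return self.__dict__['_fields'][name]
        except KeyError:
            raise AttributeError(name)

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:                     # flush only on clean exit
            self._fields.update(self._pending)
        self.__dict__['_pending'] = None
        return False


rec = ToyRecord(name='Spanky', age=7)
with rec:
    rec.age = 8
print(rec.age)   # 8
```

Assigning to a field outside the `with` block raises immediately, which is the same guard the changelog describes with `DbfError`.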
Note: fields, number of records, table type, and other metadata is available on closed tables.
Finished adding support for 'F' (aka 'N') field types in dBase III tables; this is a practicality-beats-purity issue, as the F type is not part of the db3 standard but is exactly the same as N, and other programs will use it instead of N when creating db3 tables.

0.93.000
--------
PEP 8 changes (yo --> self, someMethod --> some_method)

0.92.002
--------
added more support for the Null type in the other custom data types

0.91.001
--------
Removed __del__ from dbf records; consequently they no longer autosave when going out of scope. Either call .write_record() explicitly, or use the new Write iterator which will call .write_record for you.
Finished adding Null support (not supported in db3 tables)

dbf-0.99.10/dbf/__init__.py
"""
=========
Copyright
=========

- Portions copyright: 2008-2012 Ad-Mail, Inc -- All rights reserved.
- Portions copyright: 2012-2017 Ethan Furman -- All rights reserved.
- Author: Ethan Furman
- Contact: ethan@stoneleaf.us

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

- Redistributions of source code must retain the above copyright notice,
  this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright notice,
  this list of conditions and the following disclaimer in the documentation
  and/or other materials provided with the distribution.
- Neither the name of Ad-Mail, Inc nor the names of its contributors may be
  used to endorse or promote products derived from this software without
  specific prior written permission.
THIS SOFTWARE IS PROVIDED ''AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL ITS
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS;
OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF
ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
"""

version = 0, 99, 10

# Python 2 code may need to change these
default_codepage = None     # will be set by tables module (defaults to ascii)
default_type = 'db3'        # lowest common denominator
input_decoding = 'ascii'

# make dbf module importable internally (i.e. from . import dbf)
import sys as _sys
dbf = _sys.modules[__package__]

## user-defined pql functions (pql == primitive query language)
# it is not real sql and won't be for a long time (if ever)
pql_user_functions = dict()

## signature:_meta of template records
_Template_Records = dict()

# from dbf.api import *
class fake_module(object):

    def __init__(self, name, *args):
        self.name = name
        self.__all__ = []
        all_objects = globals()
        for name in args:
            self.__dict__[name] = all_objects[name]
            self.__all__.append(name)

    def register(self):
        _sys.modules["%s.%s" % (__name__, self.name)] = self

from .bridge import Decimal
from .exceptions import DbfError, DataOverflowError, BadDataError
from .exceptions import FieldMissingError, FieldSpecError, NonUnicodeError
from .exceptions import NotFoundError, DbfWarning, Eof, Bof, DoNotIndex
from .exceptions import FieldNameWarning
from .constants import CLOSED, READ_ONLY, READ_WRITE, IN_MEMORY, ON_DISK
from .utils import create_template, delete, field_names, is_deleted, recno
from .utils import reset, source_table, undelete, write, Process, Templates
from .utils import gather, scatter, scan, ensure_unicode, table_type
from .utils import add_fields, delete_fields, export, from_csv
from .data_types import Char, Date, DateTime, Time, Logical, Quantum, Null
from .data_types import NullDate, NullDateTime, NullTime, NullType, NoneType
from .data_types import Vapor, Period, On, Off, Other, Truth, Falsth, Unknown
from .pql import pqlc
from .tables import Table, Record, List, Index, Relation, Iter, IndexLocation
from .tables import CodePage, FieldnameList, RecordTemplate
from .tables import Db3Table, ClpTable, FpTable, VfpTable
from .tables import RecordVaporWare

api = fake_module('api',
    'Table', 'Record', 'List', 'Index', 'Relation', 'Iter', 'Null', 'Char',
    'Date', 'DateTime', 'Time', 'Logical', 'Quantum', 'CodePage',
    'create_template', 'delete', 'field_names', 'gather', 'is_deleted',
    'recno', 'source_table', 'reset', 'scatter', 'scan', 'undelete', 'write',
    'export', 'pqlc', 'from_csv', 'NullDate', 'NullDateTime', 'NullTime',
    'NoneType', 'NullType', 'Decimal', 'Vapor', 'Period', 'Truth', 'Falsth',
    'Unknown', 'On', 'Off', 'Other', 'table_type', 'DbfError',
    'DataOverflowError', 'BadDataError', 'FieldMissingError',
    'FieldSpecError', 'NonUnicodeError', 'NotFoundError', 'DbfWarning',
    'Eof', 'Bof', 'DoNotIndex', 'FieldNameWarning', 'IndexLocation',
    'Process', 'Templates', 'CLOSED', 'READ_ONLY', 'READ_WRITE',
    )
api.register()

__all__ = (
    'Decimal', 'DbfError', 'DataOverflowError', 'BadDataError',
    'FieldMissingError', 'FieldSpecError', 'NonUnicodeError',
    'NotFoundError', 'DbfWarning', 'Eof', 'Bof', 'DoNotIndex',
    'FieldNameWarning',
    'CLOSED', 'READ_ONLY', 'READ_WRITE', 'IN_MEMORY', 'ON_DISK',
    'create_template', 'delete', 'field_names', 'is_deleted', 'recno',
    'reset', 'source_table', 'undelete', 'write', 'Process', 'Templates',
    'gather', 'scatter', 'scan', 'ensure_unicode', 'table_type',
    'add_fields', 'delete_fields', 'export', 'from_csv',
    'Char', 'Date', 'DateTime', 'Time',
'Logical', 'Quantum', 'Null', 'NullDate', 'NullDateTime', 'NullTime', 'NullType', 'NoneType', 'Vapor', 'Period', 'On', 'Off', 'Other', 'Truth', 'Falsth', 'Unknown', 'pqlc', 'Table', 'Record', 'List', 'Index', 'Relation', 'Iter', 'IndexLocation', 'CodePage', 'FieldnameList', 'RecordTemplate', 'Db3Table', 'ClpTable', 'FpTable', 'VfpTable', 'RecordVaporWare', ) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921761.0 dbf-0.99.10/dbf/_index.py0000664000175000017500000001676514770560041014045 0ustar00ethanethanclass IndexFile(object): "provides read/write access to a custom index file." _last_header_block = 0 # block to write new master index entries _free_node_chain = 0 # beginning of free node chain indices = None # indices in this file def __init__(self, dbf): "creates .pdx file if it doesn't exist" filename = os.path.splitext(dbf.filename)[0] filename += '.pdx' if not os.path.exists(filename): self.index_file = open(filename, 'r+b') self.index_file.write('\xea\xaf\x37\xbf' # signature '\x00'*8 # two non-existant lists '\x00'*500) # and no indices return index_file = self.index_file = open(filename, 'r+b') header = index_file.read(512) if header[:4] != '\xea\xaf\x37\xbf': raise IndexFileError("Wrong signature -- unable to use index file %r" % filename) more_header = unpack_long_int(header[4:8]) free_nodes = self.free_nodes = unpack_long_int(header[8:12]) indices = header[12:] while more_header: self.last_header_block = more_header # block to add new indices to index_file.seek(more_header) header = index_file.read(512) more_header = unpack_long_int(header[:4]) indices += header[4:] class ContainedIndex(_Navigation): "an individual index in a .pdx (plentiful index) file" def __init__(self, table, key_func, key_text, index_file, root_node, id): self.__doc__ = key_text self._meta = table._meta # keep for other info functions self.key = key_func self.file = index_file self.root = root_node self.id = id def __call__(self, record): 
rec_num = recno(record) key = self.key(record) if not isinstance(key, tuple): key = (key, ) if rec_num in self._records: if self._records[rec_num] == key: return old_key = self._records[rec_num] vindex = bisect_left(self._values, old_key) self._values.pop(vindex) self._rec_by_val.pop(vindex) del self._records[rec_num] assert rec_num not in self._records if key == (DoNotIndex, ): return vindex = bisect_right(self._values, key) self._values.insert(vindex, key) self._rec_by_val.insert(vindex, rec_num) self._records[rec_num] = key def __contains__(self, data): if not isinstance(data,(Record, RecordTemplate, tuple, dict)): raise TypeError("%r is not a record, templace, tuple, nor dict" % (data, )) if isinstance(data, Record) and source_table(data) is self._table: return recno(data) in self._records else: try: value = self.key(data) return value in self._values except Exception: for record in self: if record == data: return True return False def __getitem__(self, key): if isinstance(key, int): count = len(self._values) if not -count <= key < count: raise IndexError("Record %d is not in list." 
% key) rec_num = self._rec_by_val[key] return self._table[rec_num] elif isinstance(key, slice): result = List() start, stop, step = key.start, key.stop, key.step if start is None: start = 0 if stop is None: stop = len(self._rec_by_val) if step is None: step = 1 if step < 0: start, stop = stop - 1, -(stop - start + 1) for loc in range(start, stop, step): record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) return result elif isinstance (key, (str, unicode, tuple, Record)): if isinstance(key, Record): key = self.key(key) elif not isinstance(key, tuple): key = (key, ) loc = self.find(key) if loc == -1: raise KeyError(key) return self._table[self._rec_by_val[loc]] else: raise TypeError('indices must be integers, match objects must by strings or tuples') def __enter__(self): self._table.open() return self def __exit__(self, *exc_info): self._table.close() return False def __iter__(self): return Iter(self) def __len__(self): return len(self._records) def _clear(self): "removes all entries from index" self._values[:] = [] self._rec_by_val[:] = [] self._records.clear() def _nav_check(self): "raises error if table is closed" if self._table._meta.status == CLOSED: raise DbfError('indexed table %s is closed' % self.filename) def _partial_match(self, target, match): target = target[:len(match)] if isinstance(match[-1], (str, unicode)): target = list(target) target[-1] = target[-1][:len(match[-1])] target = tuple(target) return target == match def _purge(self, rec_num): value = self._records.get(rec_num) if value is not None: vindex = bisect_left(self._values, value) del self._records[rec_num] self._values.pop(vindex) self._rec_by_val.pop(vindex) def _reindex(self): "reindexes all records" for record in self._table: self(record) def _search(self, match, lo=0, hi=None): if hi is None: hi = len(self._values) return bisect_left(self._values, match, lo, hi) def index(self, record, start=None, stop=None): 
"""returns the index of record between start and stop start and stop default to the first and last record""" if not isinstance(record, (Record, RecordTemplate, dict, tuple)): raise TypeError("x should be a record, template, dict, or tuple, not %r" % type(record)) self._still_valid_check() if start is None: start = 0 if stop is None: stop = len(self) for i in range(start, stop): if record == (self[i]): return i else: raise NotFoundError("dbf.Index.index(x): x not in Index", data=record) def query(self, criteria): """criteria is a callback that returns a truthy value for matching record""" return pql(self, criteria) def search(self, match, partial=False): "returns dbf.List of all (partially) matching records" result = List() if not isinstance(match, tuple): match = (match, ) loc = self._search(match) if loc == len(self._values): return result while loc < len(self._values) and self._values[loc] == match: record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) loc += 1 if partial: while loc < len(self._values) and self._partial_match(self._values[loc], match): record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) loc += 1 return result ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921761.0 dbf-0.99.10/dbf/bridge.py0000664000175000017500000000412714770560041014020 0ustar00ethanethanfrom array import array from collections import deque from textwrap import dedent from decimal import Decimal import sys __all__ = [ 'bytes', 'str', 'unicode', 'basestring', 'int', 'long', 'baseinteger', 'Decimal', 'builtins', 'execute', 'ord', 'to_bytes', 'py_ver', ] py_ver = sys.version_info[:2] if py_ver < (3, 0): bytes = str str = unicode unicode = unicode basestring = bytes, unicode int = int long = long baseinteger = int, long import __builtin__ as builtins else: bytes = bytes str = str unicode = str basestring 
= unicode, int = int long = int baseinteger = int, xrange = range import builtins bi_ord = builtins.ord def ord(int_or_char): if isinstance(int_or_char, baseinteger): return int_or_char else: return bi_ord(int_or_char) ## keep pyflakes happy :( execute = None if py_ver < (3, 0): exec(dedent("""\ def execute(code, gbl=None, lcl=None): if lcl is not None: exec code in gbl, lcl elif gbl is not None: exec code in gbl else: exec code in globals() """)) def to_bytes(data): try: if not data: return b'' elif isinstance(data, bytes): return data elif isinstance(data, baseinteger): return chr(data).encode('ascii') elif isinstance(data[0], bytes): return b''.join(data) elif not isinstance(data, array): data = array('B', data) return data.tostring() except Exception: raise ValueError('unable to convert %r to bytes' % (data, )) else: exec(dedent("""\ def execute(code, gbl=None, lcl=None): exec(code, gbl, lcl) """)) def to_bytes(data): if isinstance(data, baseinteger): return chr(data).encode('ascii') elif isinstance(data, array): return data.tobytes() else: return bytes(data) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921761.0 dbf-0.99.10/dbf/constants.py0000664000175000017500000001726514770560041014607 0ustar00ethanethanfrom aenum import Enum as _Enum, IntEnum as _IntEnum, IntFlag as _IntFlag, export as _export from array import array as _array from .bridge import * ## keep pyflakes happy :( SYSTEM = NULLABLE = BINARY = NOCPTRANS = None SPACE = ASTERISK = TYPE = CR = NULL = None START = LENGTH = END = DECIMALS = FLAGS = CLASS = EMPTY = NUL = None IN_MEMORY = ON_DISK = CLOSED = READ_ONLY = READ_WRITE = None _NULLFLAG = CHAR = CURRENCY = DATE = DATETIME = DOUBLE = FLOAT = TIMESTAMP = None GENERAL = INTEGER = LOGICAL = MEMO = NUMERIC = PICTURE = None _module = globals() class _HexEnum(_IntEnum): "repr is in hex" def __repr__(self): return '<%s.%s: %#02x>' % ( self.__class__.__name__, self._name_, self._value_, ) class _ZeroEnum(_IntEnum): 
    """
    Automatically numbers enum members starting from 0.

    Includes support for a custom docstring per member.
    """
    _init_ = 'value doc'
    _start_ = 0


class IsoDay(_IntEnum):
    MONDAY = 1
    TUESDAY = 2
    WEDNESDAY = 3
    THURSDAY = 4
    FRIDAY = 5
    SATURDAY = 6
    SUNDAY = 7

    def next_delta(self, day):
        """Return number of days needed to get from self forward to day."""
        if self == day:
            return 7
        delta = day - self
        if delta < 0:
            delta += 7
        return delta

    def last_delta(self, day):
        """Return (negative) number of days needed to get from self back to day."""
        if self == day:
            return -7
        delta = day - self
        if delta > 0:
            delta -= 7
        return delta


@_export(_module)
class RelativeDay(_Enum):
    LAST_SUNDAY = ()
    LAST_SATURDAY = ()
    LAST_FRIDAY = ()
    LAST_THURSDAY = ()
    LAST_WEDNESDAY = ()
    LAST_TUESDAY = ()
    LAST_MONDAY = ()
    NEXT_MONDAY = ()
    NEXT_TUESDAY = ()
    NEXT_WEDNESDAY = ()
    NEXT_THURSDAY = ()
    NEXT_FRIDAY = ()
    NEXT_SATURDAY = ()
    NEXT_SUNDAY = ()

    def __new__(cls):
        result = object.__new__(cls)
        result._value_ = len(cls.__members__) + 1
        return result

    def days_from(self, day):
        target = IsoDay[self.name[5:]]
        if self.name[:4] == 'LAST':
            return day.last_delta(target)
        return day.next_delta(target)


class IsoMonth(_IntEnum):
    JANUARY = 1
    FEBRUARY = 2
    MARCH = 3
    APRIL = 4
    MAY = 5
    JUNE = 6
    JULY = 7
    AUGUST = 8
    SEPTEMBER = 9
    OCTOBER = 10
    NOVEMBER = 11
    DECEMBER = 12

    def next_delta(self, month):
        """Return number of months needed to get from self forward to month."""
        if self == month:
            return 12
        delta = month - self
        if delta < 0:
            delta += 12
        return delta

    def last_delta(self, month):
        """Return (negative) number of months needed to get from self back to month."""
        if self == month:
            return -12
        delta = month - self
        if delta > 0:
            delta -= 12
        return delta


@_export(_module)
class RelativeMonth(_Enum):
    LAST_DECEMBER = ()
    LAST_NOVEMBER = ()
    LAST_OCTOBER = ()
    LAST_SEPTEMBER = ()
    LAST_AUGUST = ()
    LAST_JULY = ()
    LAST_JUNE = ()
    LAST_MAY = ()
    LAST_APRIL = ()
    LAST_MARCH = ()
    LAST_FEBRUARY = ()
    LAST_JANUARY = ()
    NEXT_JANUARY = ()
    NEXT_FEBRUARY = ()
    NEXT_MARCH = ()
    NEXT_APRIL = ()
    NEXT_MAY = ()
    NEXT_JUNE = ()
    NEXT_JULY = ()
    NEXT_AUGUST = ()
    NEXT_SEPTEMBER = ()
    NEXT_OCTOBER = ()
    NEXT_NOVEMBER = ()
    NEXT_DECEMBER = ()

    def __new__(cls):
        result = object.__new__(cls)
        result._value_ = len(cls.__members__) + 1
        return result

    def months_from(self, month):
        target = IsoMonth[self.name[5:]]
        if self.name[:4] == 'LAST':
            return month.last_delta(target)
        return month.next_delta(target)


# Constants

@_export(_module)
class LatinByte(_HexEnum):
    NULL = 0x00
    SOH = 0x01
    STX = 0x02
    ETX = 0x03
    EOT = 0x04
    ENQ = 0x05
    ACK = 0x06
    BEL = 0x07
    BS = 0x08
    TAB = 0x09
    LF = 0x0a
    VT = 0x0b
    FF = 0x0c
    CR = 0x0d
    SO = 0x0e
    SI = 0x0f
    DLE = 0x10
    DC1 = 0x11
    DC2 = 0x12
    DC3 = 0x13
    DC4 = 0x14
    NAK = 0x15
    SYN = 0x16
    ETB = 0x17
    CAN = 0x18
    EM = 0x19
    EOF = 0x1a
    SUB = 0x1a
    ESC = 0x1b
    FS = 0x1c
    GS = 0x1d
    RS = 0x1e
    US = 0x1f
    SPACE = 0x20
    ASTERISK = 0x2a

    def __new__(cls, byte):
        obj = int.__new__(cls, byte)
        obj._value_ = byte
        obj.byte = chr(byte).encode('latin-1')
        obj.array = _array('B', [byte])
        return obj

    def __repr__(self):
        return (
                '<%s.%s: %#02x>'
                % (self.__class__.__name__, self._name_, self._value_)
                )

    def __add__(self, other):
        if isinstance(other, bytes):
            return self.byte + other
        elif isinstance(other, _array):
            return self.array + other
        else:
            return super(LatinByte, self).__add__(other)

    def __radd__(self, other):
        if isinstance(other, bytes):
            return other + self.byte
        elif isinstance(other, _array):
            return other + self.array
        else:
            return super(LatinByte, self).__radd__(other)


@_export(_module)
class FieldType(_IntEnum):
    def __new__(cls, char):
        char = char.upper()
        uchar = char.decode('ascii')
        int_value = ord(char)
        obj = int.__new__(cls, int_value)
        obj._value_ = int_value
        obj.symbol = uchar
        # register both cases of the type code as aliases
        for alias in (
                char.lower(),
                char.upper(),
                ):
            cls._value2member_map_[alias] = obj
            cls._value2member_map_[alias.decode('ascii')] = obj
        return obj

    def __repr__(self):
        return '<%s.%s: %r>' % (
                self.__class__.__name__,
                self._name_,
                to_bytes([self._value_]),
                )

    _NULLFLAG = b'0'
    CHAR = b'C'
    CURRENCY = b'Y'
    DATE = b'D'
    DATETIME = b'T'
    DOUBLE = b'B'
    FLOAT = b'F'
    GENERAL = b'G'
    INTEGER = b'I'
    LOGICAL = b'L'
    MEMO = b'M'
    NUMERIC = b'N'
    PICTURE = b'P'
    TIMESTAMP = b'@'


@_export(_module)
class FieldFlag(_IntFlag):
    @classmethod
    def lookup(cls, alias):
        alias = alias.lower()
        if alias in ('system', ):
            return cls.SYSTEM
        elif alias in ('null', 'nullable'):
            return cls.NULLABLE
        elif alias in ('binary', 'nocptrans'):
            return cls.BINARY
        else:
            raise ValueError('no FieldFlag %r' % alias)

    @property
    def text(self):
        if self is NULLABLE:
            return 'null'
        else:
            return self._name_.lower()

    SYSTEM = 0x01
    NULLABLE = 0x02
    BINARY = 0x04
    NOCPTRANS = 0x04
    #AUTOINC = 0x0c         # not currently supported (not vfp 6)


@_export(_module)
class Field(_ZeroEnum):
    __order__ = 'TYPE START LENGTH END DECIMALS FLAGS CLASS EMPTY NUL'
    TYPE = "Char, Date, Logical, etc."
    START = "Field offset in record"
    LENGTH = "Length of field in record"
    END = "End of field in record (exclusive)"
    DECIMALS = "Number of decimal places if numeric"
    FLAGS = "System, Binary, Nullable"
    CLASS = "python class type"
    EMPTY = "python function for empty field"
    NUL = "python function for null field"


@_export(_module)
class DbfLocation(_ZeroEnum):
    __order__ = 'IN_MEMORY ON_DISK'
    IN_MEMORY = "dbf is kept in memory (disappears at program end)"
    ON_DISK = "dbf is kept on disk"


@_export(_module)
class DbfStatus(_ZeroEnum):
    __order__ = 'CLOSED READ_ONLY READ_WRITE'
    CLOSED = 'closed (only meta information available)'
    READ_ONLY = 'read-only'
    READ_WRITE = 'read-write'

# dbf-0.99.10/dbf/data_types.py

from math import floor
import datetime
import time

from .bridge import *
from .constants import *
from .utils import string, is_leapyear

try:
    import pytz
except ImportError:
    pytz = None

NoneType = type(None)

##                  dec jan feb mar apr may jun jul aug sep oct nov dec jan
days_per_month = [31, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31, 31]
days_per_leap_month = [31, 31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31, 31]

def days_in_month(year):
    return (days_per_month, days_per_leap_month)[is_leapyear(year)]

# gets replaced later by their final values
Unknown = Other = object()


class NullType(object):
    """
    Null object -- any interaction returns Null
    """
    def _null(self, *args, **kwargs):
        return self

    __eq__ = __ne__ = __ge__ = __gt__ = __le__ = __lt__ = _null
    __add__ = __iadd__ = __radd__ = _null
    __sub__ = __isub__ = __rsub__ = _null
    __mul__ = __imul__ = __rmul__ = _null
    __div__ = __idiv__ = __rdiv__ = _null
    __mod__ = __imod__ = __rmod__ = _null
    __pow__ = __ipow__ = __rpow__ = _null
    __and__ = __iand__ = __rand__ = _null
    __xor__ = __ixor__ = __rxor__ = _null
    __or__ = __ior__ = __ror__ = _null
    __truediv__ = __itruediv__ = __rtruediv__ = _null
    __floordiv__ = __ifloordiv__ = __rfloordiv__ = _null
    __lshift__ = __ilshift__ = __rlshift__ = _null
    __rshift__ = __irshift__ = __rrshift__ = _null
    __neg__ = __pos__ = __abs__ = __invert__ = _null
    __call__ = __getattr__ = _null

    def __divmod__(self, other):
        return self, self
    __rdivmod__ = __divmod__

    __hash__ = None

    def __new__(cls, *args):
        return cls.null

    if py_ver < (3, 0):
        def __nonzero__(self):
            return False
    else:
        def __bool__(self):
            return False

    def __repr__(self):
        return ''

    def __setattr__(self, name, value):
        return None

    def __setitem__(self, index, value):
        return None

    def __str__(self):
        return ''

NullType.null = object.__new__(NullType)
Null = NullType()


class Vapor(object):
    """
    used in Vapor Records -- compares unequal with everything
    """
    def __eq__(self, other):
        return False

    def __ne__(self, other):
        return True

    if py_ver < (3, 0):
        def __nonzero__(self):
            """
            Vapor objects are always False
            """
            return False
    else:
        def __bool__(self):
            """
            Vapor objects are always False
            """
            return False

Vapor = Vapor()


class Char(unicode):
    """
    Strips trailing whitespace, and ignores trailing whitespace for comparisons
    """
    def __new__(cls, text=''):
        if not isinstance(text, (basestring,
                cls)):
            raise ValueError("Unable to automatically coerce %r to Char" % text)
        result = unicode.__new__(cls, text.rstrip())
        result.field_size = len(text)
        return result

    __hash__ = unicode.__hash__

    def __eq__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) == other.rstrip()

    def __ge__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) >= other.rstrip()

    def __gt__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) > other.rstrip()

    def __le__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) <= other.rstrip()

    def __lt__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) < other.rstrip()

    def __ne__(self, other):
        """
        ignores trailing whitespace
        """
        if not isinstance(other, (self.__class__, basestring)):
            return NotImplemented
        return unicode(self) != other.rstrip()

    if py_ver < (3, 0):
        def __nonzero__(self):
            """
            ignores trailing whitespace
            """
            return bool(unicode(self))
    else:
        def __bool__(self):
            """
            ignores trailing whitespace
            """
            return bool(unicode(self))

    def __add__(self, other):
        result = self.__class__(unicode(self) + other)
        result.field_size = self.field_size
        return result

from . import bridge
basestring = bridge.basestring = bridge.basestring + (Char, )
baseinteger = bridge.baseinteger

# wrappers around datetime and logical objects to allow null values


class Date(object):
    """
    adds null capable datetime.date constructs
    """
    __slots__ = ['_date']

    def __new__(cls, year=None, month=0, day=0):
        """
        date should be either a datetime.date or date/month/day should
        all be appropriate integers
        """
        if year is None or year is Null:
            return cls._null_date
        nd = object.__new__(cls)
        if isinstance(year, basestring):
            return Date.strptime(year)
        elif isinstance(year, (datetime.date)):
            nd._date = year
        elif isinstance(year, (Date)):
            nd._date = year._date
        else:
            nd._date = datetime.date(year, month, day)
        return nd

    def __add__(self, other):
        if self and isinstance(other, (datetime.timedelta)):
            return Date(self._date + other)
        else:
            return NotImplemented

    def __eq__(self, other):
        if isinstance(other, self.__class__):
            return self._date == other._date
        if isinstance(other, datetime.date):
            return self._date == other
        if isinstance(other, type(None)):
            return self._date is None
        return NotImplemented

    def __format__(self, spec):
        if self:
            return self._date.__format__(spec)
        return ''

    def __getattr__(self, name):
        if name == '_date':
            raise AttributeError('_date missing!')
        elif self:
            return getattr(self._date, name)
        else:
            raise AttributeError('NullDate object has no attribute %s' % name)

    def __ge__(self, other):
        if self:
            if isinstance(other, (datetime.date)):
                return self._date >= other
            elif isinstance(other, (Date)):
                if other:
                    return self._date >= other._date
                return True
        else:
            if isinstance(other, (datetime.date)):
                return False
            elif isinstance(other, (Date)):
                if other:
                    return False
                return True
        return NotImplemented

    def __gt__(self, other):
        if self:
            if isinstance(other, (datetime.date)):
                return self._date > other
            elif isinstance(other, (Date)):
                if other:
                    return self._date > other._date
                return True
        else:
            if isinstance(other, (datetime.date)):
                return False
            elif isinstance(other, (Date)):
                if other:
                    return False
                return False
        return NotImplemented

    def __hash__(self):
        return hash(self._date)

    def __le__(self, other):
        if self:
            if isinstance(other, (datetime.date)):
                return self._date <= other
            elif isinstance(other, (Date)):
                if other:
                    return self._date <= other._date
                return False
        else:
            if isinstance(other, (datetime.date)):
                return True
            elif isinstance(other, (Date)):
                if other:
                    return True
                return True
        return NotImplemented

    def __lt__(self, other):
        if self:
            if isinstance(other, (datetime.date)):
                return self._date < other
            elif isinstance(other, (Date)):
                if other:
                    return self._date < other._date
                return False
        else:
            if isinstance(other, (datetime.date)):
                return True
            elif isinstance(other, (Date)):
                if other:
                    return True
                return False
        return NotImplemented

    def __ne__(self, other):
        if self:
            if isinstance(other, (datetime.date)):
                return self._date != other
            elif isinstance(other, (Date)):
                if other:
                    return self._date != other._date
                return True
        else:
            if isinstance(other, (datetime.date)):
                return True
            elif isinstance(other, (Date)):
                if other:
                    return True
                return False
        return NotImplemented

    if py_ver < (3, 0):
        def __nonzero__(self):
            return self._date is not None
    else:
        def __bool__(self):
            return self._date is not None

    __radd__ = __add__

    def __rsub__(self, other):
        if self and isinstance(other, (datetime.date)):
            return other - self._date
        elif self and isinstance(other, (Date)):
            return other._date - self._date
        elif self and isinstance(other, (datetime.timedelta)):
            return Date(other - self._date)
        else:
            return NotImplemented

    def __repr__(self):
        if self:
            return "Date(%d, %d, %d)" % self.timetuple()[:3]
        else:
            return "Date()"

    def __str__(self):
        if self:
            return unicode(self._date)
        return ""

    def __sub__(self, other):
        if self and isinstance(other, (datetime.date)):
            return self._date - other
        elif self and isinstance(other, (Date)):
            return self._date - other._date
        elif self and isinstance(other, (datetime.timedelta)):
            return Date(self._date - other)
        else:
            return NotImplemented

    def date(self):
        if self:
            return self._date
        return None

    @classmethod
    def fromordinal(cls, number):
        if number:
            return cls(datetime.date.fromordinal(number))
        return cls()

    @classmethod
    def fromtimestamp(cls, timestamp):
        return cls(datetime.date.fromtimestamp(timestamp))

    @classmethod
    def fromymd(cls, yyyymmdd):
        if yyyymmdd in ('', '        ', 'no date', '00000000'):
            return cls()
        return cls(datetime.date(int(yyyymmdd[:4]), int(yyyymmdd[4:6]), int(yyyymmdd[6:])))

    def replace(self, year=None, month=None, day=None, delta_year=0, delta_month=0, delta_day=0):
        if not self:
            return self.__class__._null_date
        old_year, old_month, old_day = self.timetuple()[:3]
        if isinstance(month, RelativeMonth):
            this_month = IsoMonth(old_month)
            delta_month += month.months_from(this_month)
            month = None
        if isinstance(day, RelativeDay):
            this_day = IsoDay(self.isoweekday())
            delta_day += day.days_from(this_day)
            day = None
        year = (year or old_year) + delta_year
        month = (month or old_month) + delta_month
        day = (day or old_day) + delta_day
        days_in_month = (days_per_month, days_per_leap_month)[is_leapyear(year)]
        while not(0 < month < 13) or not (0 < day <= days_in_month[month]):
            while month < 1:
                year -= 1
                month = 12 + month
            while month > 12:
                year += 1
                month = month - 12
            days_in_month = (days_per_month, days_per_leap_month)[is_leapyear(year)]
            while day < 1:
                month -= 1
                day = days_in_month[month] + day
                if not 0 < month < 13:
                    break
            while day > days_in_month[month]:
                day = day - days_in_month[month]
                month += 1
                if not 0 < month < 13:
                    break
        return Date(year, month, day)

    def strftime(self, format):
        fmt_cls = type(format)
        if self:
            return fmt_cls(self._date.strftime(format))
        return fmt_cls('')

    @classmethod
    def strptime(cls, date_string, format=None):
        if format is not None:
            return cls(*(time.strptime(date_string, format)[0:3]))
        return cls(*(time.strptime(date_string, "%Y-%m-%d")[0:3]))

    def timetuple(self):
        return self._date.timetuple()

    @classmethod
    def today(cls):
        return cls(datetime.date.today())

    def ymd(self):
        if self:
            return "%04d%02d%02d" % self.timetuple()[:3]
        else:
            return '        '

Date.max = Date(datetime.date.max)
Date.min = Date(datetime.date.min)
Date._null_date = object.__new__(Date)
Date._null_date._date = None
NullDate = Date()


class DateTime(object):
    """
    adds null capable datetime.datetime constructs
    """
    __slots__ = ['_datetime']

    def __new__(cls, year=None, month=0, day=0, hour=0, minute=0, second=0, microsecond=0, tzinfo=Null):
        """year may be a datetime.datetime"""
        if year is None or year is Null:
            return cls._null_datetime
        ndt = object.__new__(cls)
        if isinstance(year, basestring):
            return DateTime.strptime(year)
        elif isinstance(year, DateTime):
            if tzinfo is not Null and year._datetime.tzinfo:
                raise ValueError('not naive datetime (tzinfo is already set)')
            elif tzinfo is Null:
                tzinfo = None
            ndt._datetime = year._datetime
        elif isinstance(year, datetime.datetime):
            if tzinfo is not Null and year.tzinfo:
                raise ValueError('not naive datetime (tzinfo is already set)')
            elif tzinfo is Null:
                tzinfo = year.tzinfo
            microsecond = year.microsecond // 1000 * 1000
            hour, minute, second = year.hour, year.minute, year.second
            year, month, day = year.year, year.month, year.day
            if pytz is None or tzinfo is None:
                ndt._datetime = datetime.datetime(year, month, day, hour, minute, second, microsecond, tzinfo)
            else:
                # if pytz and tzinfo, tzinfo must be added after creation
                _datetime = datetime.datetime(year, month, day, hour, minute, second, microsecond)
                ndt._datetime = tzinfo.normalize(tzinfo.localize(_datetime))
        elif year is not None:
            if tzinfo is Null:
                tzinfo = None
            microsecond = microsecond // 1000 * 1000
            if pytz is None or tzinfo is None:
                ndt._datetime = datetime.datetime(year, month, day, hour, minute, second, microsecond, tzinfo)
            else:
                # if pytz and tzinfo, tzinfo must be added after creation
                _datetime = datetime.datetime(year, month, day, hour, minute, second, microsecond)
                ndt._datetime = tzinfo.normalize(tzinfo.localize(_datetime))
        return ndt

    def __add__(self, other):
        if self and isinstance(other,
                (datetime.timedelta)):
            return DateTime(self._datetime + other)
        else:
            return NotImplemented

    def __eq__(self, other):
        if isinstance(other, self.__class__):
            return self._datetime == other._datetime
        if isinstance(other, datetime.date):
            me = self._datetime.timetuple()
            them = other.timetuple()
            return me[:6] == them[:6] and self.microsecond == (other.microsecond//1000*1000)
        if isinstance(other, type(None)):
            return self._datetime is None
        return NotImplemented

    def __format__(self, spec):
        if self:
            return self._datetime.__format__(spec)
        return ''

    def __getattr__(self, name):
        if name == '_datetime':
            raise AttributeError('_datetime missing!')
        elif self:
            return getattr(self._datetime, name)
        else:
            raise AttributeError('NullDateTime object has no attribute %s' % name)

    def __ge__(self, other):
        if self:
            if isinstance(other, (datetime.datetime)):
                return self._datetime >= other
            elif isinstance(other, (DateTime)):
                if other:
                    return self._datetime >= other._datetime
                return True
        else:
            if isinstance(other, (datetime.datetime)):
                return False
            elif isinstance(other, (DateTime)):
                if other:
                    return False
                return True
        return NotImplemented

    def __gt__(self, other):
        if self:
            if isinstance(other, (datetime.datetime)):
                return self._datetime > other
            elif isinstance(other, (DateTime)):
                if other:
                    return self._datetime > other._datetime
                return True
        else:
            if isinstance(other, (datetime.datetime)):
                return False
            elif isinstance(other, (DateTime)):
                if other:
                    return False
                return False
        return NotImplemented

    def __hash__(self):
        return self._datetime.__hash__()

    def __le__(self, other):
        if self:
            if isinstance(other, (datetime.datetime)):
                return self._datetime <= other
            elif isinstance(other, (DateTime)):
                if other:
                    return self._datetime <= other._datetime
                return False
        else:
            if isinstance(other, (datetime.datetime)):
                return True
            elif isinstance(other, (DateTime)):
                if other:
                    return True
                return True
        return NotImplemented

    def __lt__(self, other):
        if self:
            if isinstance(other, (datetime.datetime)):
                return self._datetime < other
            elif isinstance(other, (DateTime)):
                if other:
                    return self._datetime < other._datetime
                return False
        else:
            if isinstance(other, (datetime.datetime)):
                return True
            elif isinstance(other, (DateTime)):
                if other:
                    return True
                return False
        return NotImplemented

    def __ne__(self, other):
        if self:
            if isinstance(other, (datetime.datetime)):
                return self._datetime != other
            elif isinstance(other, (DateTime)):
                if other:
                    return self._datetime != other._datetime
                return True
        else:
            if isinstance(other, (datetime.datetime)):
                return True
            elif isinstance(other, (DateTime)):
                if other:
                    return True
                return False
        return NotImplemented

    if py_ver < (3, 0):
        def __nonzero__(self):
            return self._datetime is not None
    else:
        def __bool__(self):
            return self._datetime is not None

    __radd__ = __add__

    def __rsub__(self, other):
        if self and isinstance(other, (datetime.datetime)):
            return other - self._datetime
        elif self and isinstance(other, (DateTime)):
            return other._datetime - self._datetime
        elif self and isinstance(other, (datetime.timedelta)):
            return DateTime(other - self._datetime)
        else:
            return NotImplemented

    def __repr__(self):
        if self:
            if self.tzinfo is None:
                tz = ''
            else:
                diff = self._datetime.utcoffset()
                hours, minutes = divmod(diff.days * 86400 + diff.seconds, 3600)
                minus, hours = hours < 0, abs(hours)
                tz = ', tzinfo=<%s %s%02d%02d>' % (self._datetime.tzname(), ('','-')[minus], hours, minutes)
            return "DateTime(%d, %d, %d, %d, %d, %d, %d%s)" % (
                    self._datetime.timetuple()[:6] + (self._datetime.microsecond, tz)
                    )
        else:
            return "DateTime()"

    def __str__(self):
        if self:
            return unicode(self._datetime)
        return ""

    def __sub__(self, other):
        if self and isinstance(other, (datetime.datetime)):
            return self._datetime - other
        elif self and isinstance(other, (DateTime)):
            return self._datetime - other._datetime
        elif self and isinstance(other, (datetime.timedelta)):
            return DateTime(self._datetime - other)
        else:
            return NotImplemented

    @classmethod
    def combine(cls, date, time, tzinfo=Null):
        # if tzinfo is given, timezone is added/stripped
        if tzinfo is Null:
            tzinfo = time.tzinfo
        if Date(date) and Time(time):
            return cls(
                    date.year, date.month, date.day,
                    time.hour, time.minute, time.second, time.microsecond,
                    tzinfo=tzinfo,
                    )
        return cls()

    def date(self):
        if self:
            return Date(self.year, self.month, self.day)
        return Date()

    def datetime(self):
        if self:
            return self._datetime
        return None

    @classmethod
    def fromordinal(cls, number):
        if number:
            return cls(datetime.datetime.fromordinal(number))
        else:
            return cls()

    @classmethod
    def fromtimestamp(cls, timestamp):
        return DateTime(datetime.datetime.fromtimestamp(timestamp))

    @classmethod
    def now(cls, tzinfo=None):
        "only accurate to milliseconds"
        return cls(datetime.datetime.now(), tzinfo=tzinfo)

    def replace(self, year=None, month=None, day=None, hour=None, minute=None, second=None, microsecond=None,
            tzinfo=Null, delta_year=0, delta_month=0, delta_day=0, delta_hour=0, delta_minute=0, delta_second=0):
        if not self:
            return self.__class__._null_datetime
        old_year, old_month, old_day, old_hour, old_minute, old_second, old_micro = self.timetuple()[:7]
        if tzinfo is Null:
            tzinfo = self._datetime.tzinfo
        if isinstance(month, RelativeMonth):
            this_month = IsoMonth(old_month)
            delta_month += month.months_from(this_month)
            month = None
        if isinstance(day, RelativeDay):
            this_day = IsoDay(self.isoweekday())
            delta_day += day.days_from(this_day)
            day = None
        year = (year or old_year) + delta_year
        month = (month or old_month) + delta_month
        day = (day or old_day) + delta_day
        hour = (hour or old_hour) + delta_hour
        minute = (minute or old_minute) + delta_minute
        second = (second or old_second) + delta_second
        microsecond = microsecond or old_micro
        days_in_month = (days_per_month, days_per_leap_month)[is_leapyear(year)]
        while (
                not (0 < month < 13)
                or not (0 < day <= days_in_month[month])
                or not (0 <= hour < 24)
                or not (0 <= minute < 60)
                or not (0 <= second < 60)
                ):
            while month < 1:
                year -= 1
                month = 12 + month
            while month > 12:
                year += 1
                month = month - 12
            days_in_month = (days_per_month, days_per_leap_month)[is_leapyear(year)]
            while day < 1:
                month -= 1
                day = days_in_month[month] + day
                if not 0 < month < 13:
                    break
            while day > days_in_month[month]:
                day = day - days_in_month[month]
                month += 1
                if not 0 < month < 13:
                    break
            while hour < 1:
                day -= 1
                hour = 24 + hour
            while hour > 23:
                day += 1
                hour = hour - 24
            while minute < 0:
                hour -= 1
                minute = 60 + minute
            while minute > 59:
                hour += 1
                minute = minute - 60
            while second < 0:
                minute -= 1
                second = 60 + second
            while second > 59:
                minute += 1
                second = second - 60
        return DateTime(year, month, day, hour, minute, second, microsecond, tzinfo)

    def strftime(self, format):
        fmt_cls = type(format)
        if self:
            return fmt_cls(self._datetime.strftime(format))
        return fmt_cls('')

    @classmethod
    def strptime(cls, datetime_string, format=None):
        if format is not None:
            return cls(datetime.datetime.strptime(datetime_string, format))
        for format in (
                "%Y-%m-%d %H:%M:%S.%f",
                "%Y-%m-%d %H:%M:%S",
                ):
            try:
                return cls(datetime.datetime.strptime(datetime_string, format))
            except ValueError:
                pass
        raise ValueError("Unable to convert %r" % datetime_string)

    def time(self):
        if self:
            return Time(self.hour, self.minute, self.second, self.microsecond)
        return Time()

    def timetuple(self):
        return self._datetime.timetuple()

    def timetz(self):
        if self:
            return Time(self._datetime.timetz())
        return Time()

    @classmethod
    def utcnow(cls):
        return cls(datetime.datetime.utcnow())

    @classmethod
    def today(cls):
        return cls(datetime.datetime.today())

DateTime.max = DateTime(datetime.datetime.max)
DateTime.min = DateTime(datetime.datetime.min)
DateTime._null_datetime = object.__new__(DateTime)
DateTime._null_datetime._datetime = None
NullDateTime = DateTime()


class Time(object):
    """
    adds null capable datetime.time constructs
    """
    __slots__ = ['_time']

    def __new__(cls, hour=None, minute=0, second=0, microsecond=0, tzinfo=Null):
        """
        hour may be a datetime.time or a str(Time)
        """
        if hour is None or hour is Null:
            return cls._null_time
        nt = object.__new__(cls)
        if isinstance(hour, basestring):
            hour = Time.strptime(hour)
        if isinstance(hour, Time):
            if tzinfo is not Null and hour._time.tzinfo:
                raise ValueError('not naive time (tzinfo is already set)')
            elif tzinfo is Null:
                tzinfo = None
            nt._time = hour._time.replace(tzinfo=tzinfo)
        elif isinstance(hour, (datetime.time)):
            if tzinfo is not Null and hour.tzinfo:
                raise ValueError('not naive time (tzinfo is already set)')
            if tzinfo is Null:
                tzinfo = hour.tzinfo
            microsecond = hour.microsecond // 1000 * 1000
            hour, minute, second = hour.hour, hour.minute, hour.second
            nt._time = datetime.time(hour, minute, second, microsecond, tzinfo)
        elif hour is not None:
            if tzinfo is Null:
                tzinfo = None
            microsecond = microsecond // 1000 * 1000
            nt._time = datetime.time(hour, minute, second, microsecond, tzinfo)
        return nt

    def __add__(self, other):
        if self and isinstance(other, (datetime.timedelta)):
            t = self._time
            t = datetime.datetime(2012, 6, 27, t.hour, t.minute, t.second, t.microsecond)
            t += other
            return Time(t.hour, t.minute, t.second, t.microsecond)
        else:
            return NotImplemented

    def __eq__(self, other):
        if isinstance(other, self.__class__):
            return self._time == other._time
        if isinstance(other, datetime.time):
            return (
                    self.hour == other.hour and
                    self.minute == other.minute and
                    self.second == other.second and
                    self.microsecond == (other.microsecond//1000*1000)
                    )
        if isinstance(other, type(None)):
            return self._time is None
        return NotImplemented

    def __format__(self, spec):
        if self:
            return self._time.__format__(spec)
        return ''

    def __getattr__(self, name):
        if name == '_time':
            raise AttributeError('_time missing!')
        elif self:
            return getattr(self._time, name)
        else:
            raise AttributeError('NullTime object has no attribute %s' % name)

    def __ge__(self, other):
        if self:
            if isinstance(other, (datetime.time)):
                return self._time >= other
            elif isinstance(other, (Time)):
                if other:
                    return self._time >= other._time
                return True
        else:
            if isinstance(other, (datetime.time)):
                return False
            elif isinstance(other, (Time)):
                if other:
                    return False
                return True
        return NotImplemented

    def __gt__(self, other):
        if self:
            if isinstance(other, (datetime.time)):
                return self._time > other
            elif isinstance(other, (Time)):
                if other:
                    return self._time > other._time
                return True
        else:
            if isinstance(other, (datetime.time)):
                return False
            elif isinstance(other, (Time)):
                if other:
                    return False
                return False
        return NotImplemented

    def __hash__(self):
        return self._time.__hash__()

    def __le__(self, other):
        if self:
            if isinstance(other, (datetime.time)):
                return self._time <= other
            elif isinstance(other, (Time)):
                if other:
                    return self._time <= other._time
                return False
        else:
            if isinstance(other, (datetime.time)):
                return True
            elif isinstance(other, (Time)):
                if other:
                    return True
                return True
        return NotImplemented

    def __lt__(self, other):
        if self:
            if isinstance(other, (datetime.time)):
                return self._time < other
            elif isinstance(other, (Time)):
                if other:
                    return self._time < other._time
                return False
        else:
            if isinstance(other, (datetime.time)):
                return True
            elif isinstance(other, (Time)):
                if other:
                    return True
                return False
        return NotImplemented

    def __ne__(self, other):
        if self:
            if isinstance(other, (datetime.time)):
                return self._time != other
            elif isinstance(other, (Time)):
                if other:
                    return self._time != other._time
                return True
        else:
            if isinstance(other, (datetime.time)):
                return True
            elif isinstance(other, (Time)):
                if other:
                    return True
                return False
        return NotImplemented

    if py_ver < (3, 0):
        def __nonzero__(self):
            return self._time is not None
    else:
        def __bool__(self):
            return self._time is not None

    __radd__ = __add__

    def __rsub__(self, other):
        if self and isinstance(other, (Time, datetime.time)):
            t = self._time
            t = datetime.datetime(2012, 6, 27, t.hour, t.minute, t.second, t.microsecond)
            other = datetime.datetime(2012, 6, 27, other.hour, other.minute, other.second, other.microsecond)
            other -= t
            return other
        else:
            return NotImplemented

    def __repr__(self):
        if self:
            if self.tzinfo is None:
                tz = ''
            else:
                diff = self._time.tzinfo.utcoffset(self._time)
                hours, minutes = divmod(diff.days * 86400 + diff.seconds, 3600)
                minus, hours = hours < 0, abs(hours)
                tz = ', tzinfo=<%s %s%02d%02d>' % (self._time.tzinfo.tzname(self._time), ('','-')[minus], hours, minutes)
            return "Time(%d, %d, %d, %d%s)" % (self.hour, self.minute, self.second, self.microsecond, tz)
        else:
            return "Time()"

    def __str__(self):
        if self:
            return unicode(self._time)
        return ""

    def __sub__(self, other):
        if self and isinstance(other, (Time, datetime.time)):
            t = self._time
            t = datetime.datetime(2012, 6, 27, t.hour, t.minute, t.second, t.microsecond)
            o = datetime.datetime(2012, 6, 27, other.hour, other.minute, other.second, other.microsecond)
            return t - o
        elif self and isinstance(other, (datetime.timedelta)):
            t = self._time
            t = datetime.datetime(2012, 6, 27, t.hour, t.minute, t.second, t.microsecond)
            t -= other
            return Time(t.hour, t.minute, t.second, t.microsecond)
        else:
            return NotImplemented

    @classmethod
    def fromfloat(cls, num):
        "2.5 == 2 hours, 30 minutes, 0 seconds, 0 microseconds"
        if num < 0:
            raise ValueError("positive value required (got %r)" % num)
        if num == 0:
            return Time(0)
        hours = int(num)
        if hours:
            num = num % hours
        minutes = int(num * 60)
        if minutes:
            num = num * 60 % minutes
        else:
            num = num * 60
        seconds = int(num * 60)
        if seconds:
            num = num * 60 % seconds
        else:
            num = num * 60
        microseconds = int(num * 1000)
        return Time(hours, minutes, seconds, microseconds)

    @staticmethod
    def now(tzinfo=None):
        "only accurate to milliseconds"
        return DateTime.now(tzinfo).timetz()

    def replace(self, hour=None, minute=None, second=None, microsecond=None, tzinfo=Null,
            delta_hour=0, delta_minute=0, delta_second=0):
        if not self:
            return self.__class__._null_time
        if tzinfo is Null:
            tzinfo = self._time.tzinfo
        old_hour, old_minute, old_second, old_micro = self.hour, self.minute, self.second, self.microsecond
        hour = (hour or old_hour) + delta_hour
        minute = (minute or old_minute) + delta_minute
        second = (second or old_second) + delta_second
microsecond = microsecond or old_micro while not (0 <= hour < 24) or not (0 <= minute < 60) or not (0 <= second < 60): while second < 0: minute -= 1 second = 60 + second while second > 59: minute += 1 second = second - 60 while minute < 0: hour -= 1 minute = 60 + minute while minute > 59: hour += 1 minute = minute - 60 while hour < 1: hour = 24 + hour while hour > 23: hour = hour - 24 return Time(hour, minute, second, microsecond, tzinfo) def strftime(self, format): fmt_cls = type(format) if self: return fmt_cls(self._time.strftime(format)) return fmt_cls('') @classmethod def strptime(cls, time_string, format=None): if format is not None: return cls(*time.strptime(time_string, format)[3:6]) for format in ( "%H:%M:%S.%f", "%H:%M:%S", ): try: return cls(*time.strptime(time_string, format)[3:6]) except ValueError: pass raise ValueError("Unable to convert %r" % time_string) def time(self): if self: return self._time return None def tofloat(self): "returns Time as a float" hour = self.hour minute = self.minute * (1.0 / 60) second = self.second * (1.0 / 3600) microsecond = self.microsecond * (1.0 / 3600000) return hour + minute + second + microsecond Time.max = Time(datetime.time.max) Time.min = Time(datetime.time.min) Time._null_time = object.__new__(Time) Time._null_time._time = None NullTime = Time() class Period(object): "for matching various time ranges" def __init__(self, year=None, month=None, day=None, hour=None, minute=None, second=None, microsecond=None): params = vars() self._mask = {} # if year: attrs = [] if isinstance(year, (Date, datetime.date)): attrs = ['year','month','day'] elif isinstance(year, (DateTime, datetime.datetime)): attrs = ['year','month','day','hour','minute','second'] elif isinstance(year, (Time, datetime.time)): attrs = ['hour','minute','second'] for attr in attrs: value = getattr(year, attr) self._mask[attr] = value # for attr in ('year', 'month', 'day', 'hour', 'minute', 'second', 'microsecond'): value = params[attr] if value is not 
None: self._mask[attr] = value def __contains__(self, other): if not self._mask: return True for attr, value in self._mask.items(): other_value = getattr(other, attr, None) try: if other_value == value or other_value in value: continue except TypeError: pass return False return True def __repr__(self): items = [] for attr in ('year', 'month', 'day', 'hour', 'minute', 'second', 'microsecond'): if attr in self._mask: items.append('%s=%s' % (attr, self._mask[attr])) return "Period(%s)" % ', '.join(items) class Logical(object): """ Logical field return type. Accepts values of True, False, or None/Null. boolean value of Unknown is False (use Quantum if you want an exception instead. """ def __new__(cls, value=None): if value is None or value is Null or value is Other or value is Unknown: return cls.unknown elif isinstance(value, basestring): if value.lower() in ('t', 'true', 'y', 'yes', 'on'): return cls.true elif value.lower() in ('f', 'false', 'n', 'no', 'off'): return cls.false elif value.lower() in ('?', 'unknown', 'null', 'none', ' ', ''): return cls.unknown else: raise ValueError('unknown value for Logical: %s' % value) else: return (cls.false, cls.true)[bool(value)] def __add__(x, y): if isinstance(y, type(None)) or y is Unknown or x is Unknown: return Unknown try: i = int(y) except Exception: return NotImplemented return int(x) + i __radd__ = __iadd__ = __add__ def __sub__(x, y): if isinstance(y, type(None)) or y is Unknown or x is Unknown: return Unknown try: i = int(y) except Exception: return NotImplemented return int(x) - i __isub__ = __sub__ def __rsub__(y, x): if isinstance(x, type(None)) or x is Unknown or y is Unknown: return Unknown try: i = int(x) except Exception: return NotImplemented return i - int(y) def __mul__(x, y): if x == 0 or y == 0: return 0 elif isinstance(y, type(None)) or y is Unknown or x is Unknown: return Unknown try: i = int(y) except Exception: return NotImplemented return int(x) * i __rmul__ = __imul__ = __mul__ def __div__(x, y): 
        if isinstance(y, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(y)
        except Exception:
            return NotImplemented
        return int(x).__div__(i)

    __idiv__ = __div__

    def __rdiv__(y, x):
        if isinstance(x, type(None)) or y == 0 or x is Unknown or y is Unknown:
            return Unknown
        try:
            i = int(x)
        except Exception:
            return NotImplemented
        return i.__div__(int(y))

    def __truediv__(x, y):
        if isinstance(y, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(y)
        except Exception:
            return NotImplemented
        return int(x).__truediv__(i)

    __itruediv__ = __truediv__

    def __rtruediv__(y, x):
        if isinstance(x, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(x)
        except Exception:
            return NotImplemented
        return i.__truediv__(int(y))

    def __floordiv__(x, y):
        if isinstance(y, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(y)
        except Exception:
            return NotImplemented
        return int(x).__floordiv__(i)

    __ifloordiv__ = __floordiv__

    def __rfloordiv__(y, x):
        if isinstance(x, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(x)
        except Exception:
            return NotImplemented
        return i.__floordiv__(int(y))

    def __divmod__(x, y):
        if isinstance(y, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return (Unknown, Unknown)
        try:
            i = int(y)
        except Exception:
            return NotImplemented
        return divmod(int(x), i)

    def __rdivmod__(y, x):
        if isinstance(x, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return (Unknown, Unknown)
        try:
            i = int(x)
        except Exception:
            return NotImplemented
        return divmod(i, int(y))

    def __mod__(x, y):
        if isinstance(y, type(None)) or y == 0 or y is Unknown or x is Unknown:
            return Unknown
        try:
            i = int(y)
        except Exception:
            return NotImplemented
        return int(x) % i

    __imod__ = __mod__

    def __rmod__(y, x):
        if isinstance(x, type(None)) or y == 0 or x is Unknown or y is Unknown:
            return Unknown
        try:
            i = int(x)
        except Exception:
            return NotImplemented
        return i % int(y)
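# The arithmetic methods above all follow one pattern: an unknown operand (None or
# Unknown) -- or a zero divisor, for the division family -- "poisons" the result,
# while known Logical values degrade to plain ints. A minimal standalone sketch of
# that pattern (tri_div and UNKNOWN are illustrative names, not part of this library):
#
#     UNKNOWN = object()   # stand-in for Logical.unknown
#
#     def tri_div(x, y):
#         "Floor-divide tri-state values; unknowns and zero divisors poison the result."
#         if x in (None, UNKNOWN) or y in (None, UNKNOWN) or y == 0:
#             return UNKNOWN
#         return int(x) // int(y)
#
#     assert tri_div(True, 1) == 1        # True degrades to int 1
#     assert tri_div(6, None) is UNKNOWN  # unknown operand -> unknown result
#     assert tri_div(1, 0) is UNKNOWN    # zero divisor -> unknown, not an error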
    def __pow__(x, y):
        if not isinstance(y, (x.__class__, bool, type(None), baseinteger)):
            return NotImplemented
        if isinstance(y, type(None)) or y is Unknown:
            return Unknown
        i = int(y)
        if i == 0:
            return 1
        if x is Unknown:
            return Unknown
        return int(x) ** i

    __ipow__ = __pow__

    def __rpow__(y, x):
        if not isinstance(x, (y.__class__, bool, type(None), baseinteger)):
            return NotImplemented
        if y is Unknown:
            return Unknown
        i = int(y)
        if i == 0:
            return 1
        if x is Unknown or isinstance(x, type(None)):
            return Unknown
        return int(x) ** i

    def __lshift__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return Unknown
        return int(x.value) << int(y)

    __ilshift__ = __lshift__

    def __rlshift__(y, x):
        if isinstance(x, type(None)) or x is Unknown or y is Unknown:
            return Unknown
        return int(x) << int(y)

    def __rshift__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return Unknown
        return int(x.value) >> int(y)

    __irshift__ = __rshift__

    def __rrshift__(y, x):
        if isinstance(x, type(None)) or x is Unknown or y is Unknown:
            return Unknown
        return int(x) >> int(y)

    def __neg__(x):
        "NEG (negation)"
        if x in (Truth, Falsth):
            return -x.value
        return Unknown

    def __pos__(x):
        "POS (posation)"
        if x in (Truth, Falsth):
            return +x.value
        return Unknown

    def __abs__(x):
        if x in (Truth, Falsth):
            return abs(x.value)
        return Unknown

    def __invert__(x):
        if x in (Truth, Falsth):
            return (Truth, Falsth)[x.value]
        return Unknown

    def __complex__(x):
        if x.value is None:
            raise ValueError("unable to return complex() of %r" % x)
        return complex(x.value)

    def __int__(x):
        if x.value is None:
            raise ValueError("unable to return int() of %r" % x)
        return int(x.value)

    if py_ver < (3, 0):
        def __long__(x):
            if x.value is None:
                raise ValueError("unable to return long() of %r" % x)
            return long(x.value)

    def __float__(x):
        if x.value is None:
            raise ValueError("unable to return float() of %r" % x)
        return float(x.value)

    if py_ver < (3, 0):
        def __oct__(x):
            if x.value is None:
                raise ValueError("unable to return oct() of %r" % x)
            return oct(x.value)

        def __hex__(x):
            if x.value is None:
                raise ValueError("unable to return hex() of %r" % x)
            return hex(x.value)

    def __and__(x, y):
        """
        AND (conjunction) x & y:
        True iff both x, y are True
        False iff at least one of x, y is False
        Unknown otherwise
        """
        if (isinstance(x, baseinteger) and not isinstance(x, bool)) or (isinstance(y, baseinteger) and not isinstance(y, bool)):
            if x == 0 or y == 0:
                return 0
            elif x is Unknown or y is Unknown:
                return Unknown
            return int(x) & int(y)
        elif x in (False, Falsth) or y in (False, Falsth):
            return Falsth
        elif x in (True, Truth) and y in (True, Truth):
            return Truth
        elif isinstance(x, type(None)) or isinstance(y, type(None)) or y is Unknown or x is Unknown:
            return Unknown
        return NotImplemented

    __rand__ = __and__

    def __or__(x, y):
        "OR (disjunction): x | y => True iff at least one of x, y is True"
        if (isinstance(x, baseinteger) and not isinstance(x, bool)) or (isinstance(y, baseinteger) and not isinstance(y, bool)):
            if x is Unknown or y is Unknown:
                return Unknown
            return int(x) | int(y)
        elif x in (True, Truth) or y in (True, Truth):
            return Truth
        elif x in (False, Falsth) and y in (False, Falsth):
            return Falsth
        elif isinstance(x, type(None)) or isinstance(y, type(None)) or y is Unknown or x is Unknown:
            return Unknown
        return NotImplemented

    __ror__ = __or__

    def __xor__(x, y):
        "XOR (parity) x ^ y: True iff only one of x,y is True"
        if (isinstance(x, baseinteger) and not isinstance(x, bool)) or (isinstance(y, baseinteger) and not isinstance(y, bool)):
            if x is Unknown or y is Unknown:
                return Unknown
            return int(x) ^ int(y)
        elif x in (True, Truth, False, Falsth) and y in (True, Truth, False, Falsth):
            return {
                    (True, True)  : Falsth,
                    (True, False) : Truth,
                    (False, True) : Truth,
                    (False, False): Falsth,
                    }[(x, y)]
        elif isinstance(x, type(None)) or isinstance(y, type(None)) or y is Unknown or x is Unknown:
            return Unknown
        return NotImplemented

    __rxor__ = __xor__

    if py_ver < (3, 0):
        def __nonzero__(x):
            "boolean value of Unknown is assumed False"
            return x.value is True
    else:
        def __bool__(x):
            "boolean value of Unknown is assumed False"
            return x.value is True

    def __eq__(x, y):
        if isinstance(y, x.__class__):
            return x.value == y.value
        elif isinstance(y, (bool, type(None), baseinteger)):
            return x.value == y
        return NotImplemented

    def __ge__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return x.value == None
        elif isinstance(y, x.__class__):
            return x.value >= y.value
        elif isinstance(y, (bool, baseinteger)):
            return x.value >= y
        return NotImplemented

    def __gt__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return False
        elif isinstance(y, x.__class__):
            return x.value > y.value
        elif isinstance(y, (bool, baseinteger)):
            return x.value > y
        return NotImplemented

    def __le__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return x.value == None
        elif isinstance(y, x.__class__):
            return x.value <= y.value
        elif isinstance(y, (bool, baseinteger)):
            return x.value <= y
        return NotImplemented

    def __lt__(x, y):
        if isinstance(y, type(None)) or x is Unknown or y is Unknown:
            return False
        elif isinstance(y, x.__class__):
            return x.value < y.value
        elif isinstance(y, (bool, baseinteger)):
            return x.value < y
        return NotImplemented

    def __ne__(x, y):
        if isinstance(y, x.__class__):
            return x.value != y.value
        elif isinstance(y, (bool, type(None), baseinteger)):
            return x.value != y
        return NotImplemented

    def __hash__(x):
        return hash(x.value)

    def __index__(x):
        if x.value is None:
            raise ValueError("unable to return int() of %r" % x)
        return int(x.value)

    def __repr__(x):
        return "Logical(%r)" % x.string

    def __str__(x):
        return x.string

Logical.true = object.__new__(Logical)
Logical.true.value = True
Logical.true.string = 'T'
Logical.false = object.__new__(Logical)
Logical.false.value = False
Logical.false.string = 'F'
Logical.unknown = object.__new__(Logical)
Logical.unknown.value = None
Logical.unknown.string = '?'
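# Logical.__and__/__or__ above implement Kleene's strong three-valued logic: a known
# False decides a conjunction and a known True decides a disjunction, even when the
# other operand is Unknown. A self-contained sketch of those truth tables, with plain
# True/False/None standing in for Truth/Falsth/Unknown (k_and/k_or are illustrative
# names, not part of this library):
#
#     def k_and(x, y):
#         if x is False or y is False:
#             return False        # False dominates AND
#         if x is None or y is None:
#             return None         # otherwise Unknown is contagious
#         return True
#
#     def k_or(x, y):
#         if x is True or y is True:
#             return True         # True dominates OR
#         if x is None or y is None:
#             return None
#         return False
#
#     assert k_and(False, None) is False   # known False decides the conjunction
#     assert k_or(True, None) is True      # known True decides the disjunction
#     assert k_and(True, None) is None     # otherwise the result stays Unknown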
Truth = Logical(True)
Falsth = Logical(False)
Unknown = Logical()


class Quantum(object):
    """
    Logical field return type that implements boolean algebra

    Accepts values of True/On, False/Off, or None/Null/Unknown/Other
    """

    def __new__(cls, value=None):
        if value is None or value is Null or value is Other or value is Unknown:
            return cls.unknown
        elif isinstance(value, basestring):
            if value.lower() in ('t', 'true', 'y', 'yes', 'on'):
                return cls.true
            elif value.lower() in ('f', 'false', 'n', 'no', 'off'):
                return cls.false
            elif value.lower() in ('?', 'unknown', 'null', 'none', ' ', ''):
                return cls.unknown
            else:
                raise ValueError('unknown value for Quantum: %s' % value)
        else:
            return (cls.false, cls.true)[bool(value)]

    def A(x, y):
        "OR (disjunction): x | y => True iff at least one of x, y is True"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if x.value is True or y is not Other and y == True:
            return x.true
        elif x.value is False and y is not Other and y == False:
            return x.false
        return Other

    def _C_material(x, y):
        "IMP (material implication) x >> y => False iff x == True and y == False"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if (x.value is False
            or (x.value is True and y is not Other and y == True)):
            return x.true
        elif x.value is True and y is not Other and y == False:
            return x.false
        return Other

    def _C_material_reversed(y, x):
        "IMP (material implication) x >> y => False iff x = True and y = False"
        if not isinstance(x, (y.__class__, bool, NullType, type(None))):
            return NotImplemented
        if (x is not Other and x == False
            or (x is not Other and x == True and y.value is True)):
            return y.true
        elif x is not Other and x == True and y.value is False:
            return y.false
        return Other

    def _C_relevant(x, y):
        "IMP (relevant implication) x >> y => True iff both x, y are True, False iff x == True and y == False, Other if x is False"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if x.value is True and y is not Other and y == True:
            return x.true
        if x.value is True and y is not Other and y == False:
            return x.false
        return Other

    def _C_relevant_reversed(y, x):
        "IMP (relevant implication) x >> y => True iff both x, y are True, False iff x == True and y == False, Other if y is False"
        if not isinstance(x, (y.__class__, bool, NullType, type(None))):
            return NotImplemented
        if x is not Other and x == True and y.value is True:
            return y.true
        if x is not Other and x == True and y.value is False:
            return y.false
        return Other

    def D(x, y):
        "NAND (negative AND) x.D(y): False iff x and y are both True"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if x.value is False or y is not Other and y == False:
            return x.true
        elif x.value is True and y is not Other and y == True:
            return x.false
        return Other

    def E(x, y):
        "EQV (equivalence) x.E(y): True iff x and y are the same"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        elif (
            (x.value is True and y is not Other and y == True)
            or
            (x.value is False and y is not Other and y == False)
            ):
            return x.true
        elif (
            (x.value is True and y is not Other and y == False)
            or
            (x.value is False and y is not Other and y == True)
            ):
            return x.false
        return Other

    def J(x, y):
        "XOR (parity) x ^ y: True iff only one of x,y is True"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if (
            (x.value is True and y is not Other and y == False)
            or
            (x.value is False and y is not Other and y == True)
            ):
            return x.true
        if (
            (x.value is False and y is not Other and y == False)
            or
            (x.value is True and y is not Other and y == True)
            ):
            return x.false
        return Other

    def K(x, y):
        "AND (conjunction) x & y: True iff both x, y are True"
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if x.value is True and y is not Other and y == True:
            return x.true
        elif x.value is False or y is not Other and y == False:
            return x.false
        return Other

    def N(x):
        "NEG (negation) -x: True iff x = False"
        if x is x.true:
            return x.false
        elif x is x.false:
            return x.true
        return Other

    @classmethod
    def set_implication(cls, method):
        "sets IMP to material or relevant"
        if not isinstance(method, basestring) or method.lower() not in ('material', 'relevant'):
            raise ValueError("method should be 'material' (for strict boolean) or 'relevant', not %r" % method)
        if method.lower() == 'material':
            cls.C = cls._C_material
            cls.__rshift__ = cls._C_material
            cls.__rrshift__ = cls._C_material_reversed
        elif method.lower() == 'relevant':
            cls.C = cls._C_relevant
            cls.__rshift__ = cls._C_relevant
            cls.__rrshift__ = cls._C_relevant_reversed

    def __eq__(x, y):
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if (
            (x.value is True and y is not Other and y == True)
            or
            (x.value is False and y is not Other and y == False)
            ):
            return x.true
        elif (
            (x.value is True and y is not Other and y == False)
            or
            (x.value is False and y is not Other and y == True)
            ):
            return x.false
        return Other

    def __hash__(x):
        return hash(x.value)

    def __ne__(x, y):
        if not isinstance(y, (x.__class__, bool, NullType, type(None))):
            return NotImplemented
        if (
            (x.value is True and y is not Other and y == False)
            or
            (x.value is False and y is not Other and y == True)
            ):
            return x.true
        elif (
            (x.value is True and y is not Other and y == True)
            or
            (x.value is False and y is not Other and y == False)
            ):
            return x.false
        return Other

    if py_ver < (3, 0):
        def __nonzero__(x):
            if x is Other:
                raise TypeError('True/False value of %r is unknown' % x)
            return x.value is True
    else:
        def __bool__(x):
            if x is Other:
                raise TypeError('True/False value of %r is unknown' % x)
            return x.value is True

    def __repr__(x):
        return "Quantum(%r)" % x.string

    def __str__(x):
        return x.string

    __add__ = A
    __and__ = K
    __mul__ = K
    __neg__ = N
    __or__ = A
    __radd__ = A
    __rand__ = K
    __rshift__ = None
    __rmul__ = K
    __ror__ = A
    __rrshift__ = None
    __rxor__ = J
    __xor__ = J

Quantum.true = object.__new__(Quantum)
Quantum.true.value = True
Quantum.true.string = 'Y'
Quantum.false = object.__new__(Quantum)
Quantum.false.value = False
Quantum.false.string = 'N'
Quantum.unknown = object.__new__(Quantum)
Quantum.unknown.value = None
Quantum.unknown.string = '?'
Quantum.set_implication('material')
On = Quantum(True)
Off = Quantum(False)
Other = Quantum()

# add xmlrpc support
if py_ver < (3, 0):
    from xmlrpclib import Marshaller
else:
    from xmlrpc.client import Marshaller
# Char is unicode
Marshaller.dispatch[Char] = Marshaller.dump_unicode
# Logical unknown becomes False
Marshaller.dispatch[Logical] = Marshaller.dump_bool
# DateTime is transmitted as UTC if aware, local if naive
Marshaller.dispatch[DateTime] = lambda s, dt, w: w(
        '<value><dateTime.iso8601>'
        '%04d%02d%02dT%02d:%02d:%02d'
        '</dateTime.iso8601></value>\n'
        % dt.utctimetuple()[:6])
del Marshaller

# ---- dbf-0.99.10/dbf/exceptions.py ----

# warnings and errors

class _undef(object):
    def __repr__(self):
        return 'not defined'
_undef = _undef()

def exception(exc, cause=_undef, context=_undef, traceback=_undef):
    if cause is not _undef:
        exc.__cause__ = cause
    if context is not _undef:
        exc.__context__ = context
    if traceback is not _undef:
        exc.__traceback__ = traceback
    return exc


class DbfError(Exception):
    """
    Fatal errors elicit this response.
    """
    def __init__(self, message, *args):
        Exception.__init__(self, message, *args)
        self.message = message

    def from_exc(self, exc):
        self.__cause__ = exc
        return self

    def with_traceback(self, tb):
        self.__traceback__ = tb
        return self


class DataOverflowError(DbfError):
    """
    Data too large for field
    """
    def __init__(self, message, data=None):
        DbfError.__init__(self, message)
        self.data = data


class BadDataError(DbfError):
    """
    bad data in table
    """
    def __init__(self, message, data=None):
        DbfError.__init__(self, message)
        self.data = data


class FieldMissingError(KeyError, DbfError):
    """
    Field does not exist in table
    """
    def __init__(self, fieldname):
        KeyError.__init__(self, '%s: no such field in table' % fieldname)
        DbfError.__init__(self, '%s: no such field in table' % fieldname)
        self.data = fieldname


class FieldSpecError(DbfError, ValueError):
    """
    invalid field specification
    """
    def __init__(self, message):
        ValueError.__init__(self, message)
        DbfError.__init__(self, message)


class NonUnicodeError(DbfError):
    """
    Data for table not in unicode
    """
    def __init__(self, message=None):
        DbfError.__init__(self, message)


class NotFoundError(DbfError, ValueError, KeyError, IndexError):
    """
    record criteria not met
    """
    def __init__(self, message=None, data=None):
        ValueError.__init__(self, message)
        KeyError.__init__(self, message)
        IndexError.__init__(self, message)
        DbfError.__init__(self, message)
        self.data = data


class DbfWarning(UserWarning):
    """
    Normal operations elicit this response
    """


class Eof(DbfWarning, StopIteration):
    """
    End of file reached
    """
    message = 'End of file reached'
    def __init__(self):
        StopIteration.__init__(self, self.message)
        DbfWarning.__init__(self, self.message)


class Bof(DbfWarning, StopIteration):
    """
    Beginning of file reached
    """
    message = 'Beginning of file reached'
    def __init__(self):
        StopIteration.__init__(self, self.message)
        DbfWarning.__init__(self, self.message)


class DoNotIndex(DbfWarning):
    """
    Returned by indexing functions to suppress a record from becoming part of the index
    """
    message = 'Not indexing record'
    def __init__(self):
        DbfWarning.__init__(self, self.message)


class FieldNameWarning(UserWarning):
    message = 'non-standard characters in field name'

# ---- dbf-0.99.10/dbf/index.py ----

from bisect import bisect_left, bisect_right
from functools import partial
import struct
import weakref


class IndexFile(_Navigation):
    pass


class BytesType(object):

    def __init__(self, offset):
        self.offset = offset

    def __get__(self, inst, cls=None):
        if inst is None:
            return self
        start = self.offset
        end = start + self.size
        byte_data = inst._data[start:end]
        return self.from_bytes(byte_data)

    def __set__(self, inst, value):
        start = self.offset
        end = start + self.size
        byte_data = self.to_bytes(value)
        inst._data = inst._data[:start] + byte_data + inst._data[end:]


class IntBytesType(BytesType):
    """
    add big_endian and neg_one to __init__
    """
    def __init__(self, offset, big_endian=False, neg_one_is_none=False, one_based=False):
        self.offset = offset
        self.big_endian = big_endian
        self.neg_one_is_none = neg_one_is_none
        self.one_based = one_based

    def from_bytes(self, byte_data):
        if self.neg_one_is_none and byte_data == '\xff' * self.size:
            return None
        if self.big_endian:
            value = struct.unpack('>%s' % self.code, byte_data)[0]
        else:
            value = struct.unpack('<%s' % self.code, byte_data)[0]
        if self.one_based:
            # values are stored one based, convert to standard Python zero-base
            value -= 1
        return value

    def to_bytes(self, value):
        if value is None:
            if self.neg_one_is_none:
                return '\xff\xff'
            raise DbfError('unable to store None in %r' % self.__name__)
        limit = 2 ** (self.size * 8) - 1
        if self.one_based:
            limit -= 1
        if value > 2 ** limit:
            raise DataOverflowError("Maximum Integer size exceeded.  Possible: %d.  Attempted: %d" % (limit, value))
        if self.one_based:
            value += 1
        if self.big_endian:
            return struct.pack('>%s' % self.code, value)
        else:
            return struct.pack('<%s' % self.code, value)


class Int8(IntBytesType):
    """
    1-byte integer
    """
    size = 1
    code = 'B'


class Int16(IntBytesType):
    """
    2-byte integer
    """
    size = 2
    code = 'H'


class Int32(IntBytesType):
    """
    4-byte integer
    """
    size = 4
    code = 'L'


class Bytes(BytesType):

    def __init__(self, offset, size=0, fill_to=0, strip_null=False):
        if not (size or fill_to):
            raise DbfError("either size or fill_to must be specified")
        self.offset = offset
        self.size = size
        self.fill_to = fill_to
        self.strip_null = strip_null

    def from_bytes(self, byte_data):
        if self.strip_null:
            return byte_data.rstrip('\x00')
        else:
            return byte_data

    def to_bytes(self, value):
        if not isinstance(value, bytes):
            raise DbfError('value must be bytes [%r]' % value)
        if self.strip_null and len(value) < self.size:
            value += '\x00' * (self.size - len(value))
        return value


class DataBlock(object):
    """
    adds _data as a str to class
    binds variable name to BytesType descriptor
    """
    def __init__(self, size):
        self.size = size

    def __call__(self, cls):
        fields = []
        initialized = stringified = False
        for name, thing in cls.__dict__.items():
            if isinstance(thing, BytesType):
                thing.__name__ = name
                fields.append((name, thing))
            elif name in ('__init__', '__new__'):
                initialized = True
            elif name in ('__repr__', ):
                stringified = True
        fields.sort(key=lambda t: t[1].offset)
        for _, field in fields:
            offset = field.offset
            if not field.size:
                field.size = field.fill_to - offset
            total_field_size = field.offset + field.size
        if self.size and total_field_size > self.size:
            raise DbfError('Fields in %r are using %d bytes, but only %d allocated' % (cls, total_field_size, self.size))
        total_field_size = self.size or total_field_size
        cls._data = str('\x00' * total_field_size)
        cls.__len__ = lambda s: len(s._data)
        cls._size_ = total_field_size
        if not initialized:
            def init(self, data):
                if len(data) != self._size_:
                    raise Exception('%d bytes required, received %d' % (self._size_, len(data)))
                self._data = data
            cls.__init__ = init
        if not stringified:
            def repr(self):
                clauses = []
                for name, _ in fields:
                    value = getattr(self, name)
                    if isinstance(value, str) and len(value) > 12:
                        value = value[:9] + '...'
                    clauses.append('%s=%r' % (name, value))
                return ('%s(%s)' % (cls.__name__, ', '.join(clauses)))
            cls.__repr__ = repr
        return cls


class LruCache(object):
    """
    keep the most recent n items in the dict

    based on code from Raymond Hettinger: http://stackoverflow.com/a/8334739/208880
    """

    class Link(object):
        __slots__ = 'prev_link', 'next_link', 'key', 'value'

        def __init__(self, prev=None, next=None, key=None, value=None):
            self.prev_link, self.next_link, self.key, self.value = prev, next, key, value

        def __iter__(self):
            return iter((self.prev_link, self.next_link, self.key, self.value))

        def __repr__(self):
            value = self.value
            if isinstance(value, str) and len(value) > 15:
                value = value[:12] + '...'
            # format spec reconstructed; the original's '<...>' was lost in extraction
            return 'Link<key=%r, value=%r>' % (self.key, value)

    def __init__(self, maxsize, func=None):
        self.maxsize = maxsize
        self.mapping = {}
        self.tail = self.Link()                     # oldest
        self.head = self.Link(self.tail)            # newest
        self.head.prev_link = self.tail
        self.func = func
        if func is not None:
            self.__name__ = func.__name__
            self.__doc__ = func.__doc__

    def __call__(self, *func):
        if self.func is None:
            [self.func] = func
            self.__name__ = func.__name__
            self.__doc__ = func.__doc__
            return self
        mapping, head, tail = self.mapping, self.head, self.tail
        link = mapping.get(func, head)
        if link is head:
            value = self.func(*func)
            if len(mapping) >= self.maxsize:
                old_prev, old_next, old_key, old_value = tail.next_link
                tail.next_link = old_next
                old_next.prev_link = tail
                del mapping[old_key]
            behind = head.prev_link
            link = self.Link(behind, head, func, value)
            mapping[func] = behind.next_link = head.prev_link = link
        else:
            link_prev, link_next, func, value = link
            link_prev.next_link = link_next
            link_next.prev_link = link_prev
            behind = head.prev_link
            behind.next_link = head.prev_link = link
            link.prev_link = behind
            link.next_link = head
        return value


class Idx(object):
    # default numeric storage is little-endian
    # numbers used as key values, and the 4-byte numbers in leaf nodes are big-endian

    @DataBlock(512)
    class Header(object):
        root_node = Int32(0)
        free_node_list = Int32(4, neg_one_is_none=True)
        file_size = Int32(8)
        key_length = Int16(12)
        index_options = Int8(14)
        index_signature = Int8(15)
        key_expr = Bytes(16, 220, strip_null=True)
        for_expr = Bytes(236, 220, strip_null=True)

    @DataBlock(512)
    class Node(object):
        attributes = Int16(0)
        num_keys = Int16(2)
        left_peer = Int32(4, neg_one_is_none=True)
        right_peer = Int32(8, neg_one_is_none=True)
        pool = Bytes(12, fill_to=512)

        def __init__(self, byte_data, node_key, record_key):
            if len(byte_data) != 512:
                raise DbfError("incomplete header: only received %d bytes" % len(byte_data))
            self._data = byte_data
            self._node_key = node_key
            self._record_key = record_key

        def is_leaf(self):
            return self.attributes in (2, 3)

        def is_root(self):
            return self.attributes in (1, 3)

        def is_interior(self):
            return self.attributes in (0, 1)

        def keys(self):
            result = []
            if self.is_leaf():
                key = self._record_key
            else:
                key = self._node_key
            key_len = key._size_
            for i in range(self.num_keys):
                start = i * key_len
                end = start + key_len
                result.append(key(self.pool[start:end]))
            return result

    def __init__(self, table, filename, size_limit=100):
        self.table = weakref.ref(table)
        self.filename = filename
        self.limit = size_limit
        with open(filename, 'rb') as idx:
            self.header = header = self.Header(idx.read(512))
            # offset = 512
            @DataBlock(header.key_length+4)
            class NodeKey(object):
                key = Bytes(0, header.key_length)
                rec_no = Int32(header.key_length, big_endian=True)
            @DataBlock(header.key_length+4)
            class RecordKey(object):
                key = Bytes(0, header.key_length)
                rec_no = Int32(header.key_length, big_endian=True, one_based=True)
            self.NodeKey = NodeKey
            self.RecordKey = RecordKey
            # set up root node
            idx.seek(header.root_node)
self.root_node = self.Node(idx.read(512), self.NodeKey, self.RecordKey) # set up node reader self.read_node = LruCache(maxsize=size_limit, func=self.read_node) # set up iterating members self.current_node = None self.current_key = None def __iter__(self): # find the first leaf node table = self.table() if table is None: raise DbfError('the database linked to %r has been closed' % self.filename) node = self.root_node if not node.num_keys: yield return while "looking for a leaf": # travel the links down to the first leaf node if node.is_leaf(): break node = self.read_node(node.keys()[0].rec_no) while "traversing nodes": for key in node.keys(): yield table[key.rec_no] next_node = node.right_peer if next_node is None: return node = self.read_node(next_node) forward = __iter__ def read_node(self, offset): """ reads the sector indicated, and returns a Node object """ with open(self.filename, 'rb') as idx: idx.seek(offset) return self.Node(idx.read(512), self.NodeKey, self.RecordKey) def backward(self): # find the last leaf node table = self.table() if table is None: raise DbfError('the database linked to %r has been closed' % self.filename) node = self.root_node if not node.num_keys: yield return while "looking for last leaf": # travel the links down to the last leaf node if node.is_leaf(): break node = self.read_node(node.keys()[-1].rec_no) while "traversing nodes": for key in reversed(node.keys()): yield table[key.rec_no] prev_node = node.left_peer if prev_node is None: return node = self.read_node(prev_node) ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921761.0 dbf-0.99.10/dbf/pql.py0000664000175000017500000001642314770560041013362 0ustar00ethanethanfrom . 
import dbf from .bridge import * from .utils import ensure_unicode, field_names, source_table from .utils import delete, undelete, is_deleted, reset # SQL functions def pql_select(records, chosen_fields, condition, field_names): if chosen_fields != '*': field_names = chosen_fields.replace(' ', '').split(',') result = condition(records) result.modified = 0, 'record' + ('', 's')[len(result)>1] result.field_names = field_names return result def pql_update(records, command, condition, field_names): possible = condition(records) modified = pql_cmd(command, field_names)(possible) possible.modified = modified, 'record' + ('', 's')[modified>1] return possible def pql_delete(records, dead_fields, condition, field_names): deleted = condition(records) deleted.modified = len(deleted), 'record' + ('', 's')[len(deleted)>1] deleted.field_names = field_names if dead_fields == '*': for record in deleted: delete(record) else: keep = [f for f in field_names if f not in dead_fields.replace(' ', '').split(',')] for record in deleted: reset(record, keep_fields=keep) return deleted def pql_recall(records, all_fields, condition, field_names): if all_fields != '*': raise DbfError('SQL RECALL: fields must be * (only able to recover at the record level)') revivified = dbf.List() for record in condition(records): if is_deleted(record): revivified.append(record) undelete(record) revivified.modfied = len(revivified), 'record' + ('', 's')[len(revivified)>1] return revivified def pql_add(records, new_fields, condition, field_names): tables = set() possible = condition(records) for record in possible: tables.add(source_table(record)) for table in tables: table.add_fields(new_fields) possible.modified = len(tables), 'table' + ('', 's')[len(tables)>1] possible.field_names = field_names return possible def pql_drop(records, dead_fields, condition, field_names): tables = set() possible = condition(records) for record in possible: tables.add(source_table(record)) for table in tables: 
table.delete_fields(dead_fields) possible.modified = len(tables), 'table' + ('', 's')[len(tables)>1] possible.field_names = field_names return possible def pql_pack(records, command, condition, field_names): tables = set() possible = condition(records) for record in possible: tables.add(source_table(record)) for table in tables: table.pack() possible.modified = len(tables), 'table' + ('', 's')[len(tables)>1] possible.field_names = field_names return possible def pql_resize(records, fieldname_newsize, condition, field_names): tables = set() possible = condition(records) for record in possible: tables.add(source_table(record)) fieldname, newsize = fieldname_newsize.split() newsize = int(newsize) for table in tables: table.resize_field(fieldname, newsize) possible.modified = len(tables), 'table' + ('', 's')[len(tables)>1] possible.field_names = field_names return possible def pql_criteria(records, criteria): """ creates a function matching the pql criteria """ function = """def func(records): '''%s ''' _matched = dbf.List() for _rec in records: %s if %s: _matched.append(_rec) return _matched""" fields = [] criteria = ensure_unicode(criteria) uc_criteria = criteria.upper() for field in field_names(records): if field in uc_criteria: fields.append(field) criteria = criteria.replace('recno()', 'recno(_rec)').replace('is_deleted()', 'is_deleted(_rec)') fields = '\n '.join(['%s = _rec.%s' % (field.lower(), field) for field in fields]) g = dict() g['dbf'] = dbf.api g.update(dbf.pql_user_functions) function %= (criteria, fields, criteria) execute(function, g) return g['func'] def pql_cmd(command, field_names): """ creates a function matching to apply command to each record in records """ function = """def func(records): '''%s ''' _changed = 0 for _rec in records: _tmp = dbf.create_template(_rec) %s %s %s if _tmp != _rec: dbf.gather(_rec, _tmp) _changed += 1 return _changed""" fields = [] for field in field_names: if field in command: fields.append(field) command = 
command.replace('recno()', 'recno(_rec)').replace('is_deleted()', 'is_deleted(_rec)') pre_fields = '\n '.join(['%s = _tmp.%s' % (field.lower(), field) for field in fields]) post_fields = '\n '.join(['_tmp.%s = %s' % (field, field.lower()) for field in fields]) g = dbf.pql_user_functions.copy() g['dbf'] = dbf.api g['recno'] = recno g['create_template'] = create_template g['gather'] = gather if ' with ' in command.lower(): offset = command.lower().index(' with ') command = command[:offset] + ' = ' + command[offset + 6:] function %= (command, pre_fields, command, post_fields) execute(function, g) return g['func'] def pqlc(records, command): """ recognized pql commands are SELECT, UPDATE | REPLACE, DELETE, RECALL, ADD, DROP """ close_table = False if isinstance(records, basestring): records = Table(records) close_table = True try: if not records: return dbf.List() command = ensure_unicode(command) pql_command = command uc_command = command.upper() if u' WHERE ' in uc_command: index = uc_command.find(u' WHERE ') condition = command[index+7:] command = command[:index] # command, condition = command.split(' where ', 1) condition = pql_criteria(records, condition) else: def condition(records): return records[:] name, command = command.split(' ', 1) command = command.strip() name = name.upper() fields = field_names(records) if pql_functions.get(name) is None: raise DbfError('unknown SQL command %r in %r' % (name.upper(), pql_command)) result = pql_functions[name](records, command, condition, fields) tables = set() for record in result: tables.add(source_table(record)) finally: if close_table: records.close() return result pql_functions = { u'SELECT' : pql_select, u'UPDATE' : pql_update, u'REPLACE': pql_update, u'INSERT' : None, u'DELETE' : pql_delete, u'RECALL' : pql_recall, u'ADD' : pql_add, u'DROP' : pql_drop, u'COUNT' : None, u'PACK' : pql_pack, u'RESIZE' : pql_resize, } def _nop(value): """ returns parameter unchanged """ return value def _normalize_tuples(tuples,
length, filler): """ ensures each tuple is the same length, using filler[-missing:] for the gaps """ final = [] for t in tuples: if len(t) < length: final.append( tuple([item for item in t] + filler[len(t)-length:]) ) else: final.append(t) return tuple(final)
dbf-0.99.10/dbf/tables.py
from __future__ import print_function from array import array from aenum import NamedTuple from bisect import bisect_left, bisect_right from collections import defaultdict from functools import partial from glob import glob from os import SEEK_END import codecs import csv import datetime import logging import os import struct import sys import traceback import warnings import weakref from . import dbf from . import pqlc from .bridge import * from .constants import * from .constants import _NULLFLAG from .data_types import * from .exceptions import * from .utils import ensure_unicode, field_names, gather, guess_table_type, recno, scatter, source_table from .utils import is_deleted class NullHandler(logging.Handler): """ This handler does nothing. It's intended to be used to avoid the "No handlers could be found for logger XXX" one-off warning. This is important for library code, which may contain code to log events. If a user of the library does not configure logging, the one-off warning might be produced; to avoid this, the library developer simply needs to instantiate a NullHandler and add it to the top-level logger of the library module or package. Taken from 2.7 lib.
""" def handle(self, record): """Stub.""" def emit(self, record): """Stub.""" def createLock(self): self.lock = None logger = logging.getLogger('dbf') logger.addHandler(NullHandler()) temp_dir = os.environ.get("DBF_TEMP") or os.environ.get("TMP") or os.environ.get("TEMP") or "" # other constructs class LazyAttr(object): """ doesn't create object until actually accessed """ def __init__(self, func=None, doc=None): self.fget = func self.__doc__ = doc or func.__doc__ def __call__(self, func): self.fget = func def __get__(self, instance, owner): if instance is None: return self return self.fget(instance) class MutableDefault(object): """ Lives in the class, and on first access calls the supplied factory and maps the result into the instance it was called on """ def __init__(self, func): self._name = func.__name__ self.func = func def __call__(self): return self def __get__(self, instance, owner): result = self.func() if instance is not None: setattr(instance, self._name, result) return result def __repr__(self): result = self.func() return "MutableDefault(%r)" % (result, ) def none(*args, **kwargs): """ because we can't do `NoneType(*args, **kwargs)` """ return None class FieldnameList(list): "storage for field names" def __new__(cls, items=()): for item in items: item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) new_list = super(FieldnameList, cls).__new__(cls) return new_list def __init__(self, items=()): for item in items: self.append(item.upper()) def __add__(self, other_list): if not isinstance(other_list, list): return NotImplemented new_list = self[:] for item in other_list: item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() new_list.append(item) return new_list def __contains__(self, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) 
item = item.upper() return item in list(self) def __getslice__(self, start, stop): return FieldnameList(super(FieldnameList, self).__getslice__(start, stop)) def __getitem__(self, thing): res = super(FieldnameList, self).__getitem__(thing) if isinstance(res, list): res = FieldnameList(res) return res def __iadd__(self, other_list): if not isinstance(other_list, list): return NotImplemented for item in other_list: item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() super(FieldnameList, self).append(item) return self def __radd__(self, other_list): if not isinstance(other_list, list): return NotImplemented new_list = FieldnameList(other_list) new_list.extend(self) return new_list def __repr__(self): return 'FieldnameList(%s)' % super(FieldnameList, self).__repr__() def __setitem__(self, pos, item): if isinstance(item, list): if not isinstance(pos, slice): raise TypeError('%r cannot be a single field name' % item) try: new_things = [] for thing in item: thing = ensure_unicode(thing) if not isinstance(thing, unicode): raise TypeError('%r cannot be a field name' % (thing, )) thing = thing.upper() new_things.append(thing) item = new_things except TypeError: raise TypeError('%r cannot be a field name' % (thing, )) else: item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() return super(FieldnameList, self).__setitem__(pos, item) def __cmp__(self, other): if not isinstance(other, list): return NotImplemented for s, o in zip(self, other): o = ensure_unicode(o) if not isinstance(o, unicode): raise TypeError('%r cannot be a field name' % (o, )) o = o.upper() if s < o: return -1 elif s > o: return +1 # at least one list exhausted with all elements equal # now check lengths if len(self) < len(other): return -1 if len(self) > len(other): return +1 else: return 0 def __eq__(self, other): res = self.__cmp__(other) if res is
NotImplemented: return res else: return res == 0 def __ne__(self, other): res = self.__cmp__(other) if res is NotImplemented: return res else: return res != 0 def __le__(self, other): res = self.__cmp__(other) if res is NotImplemented: return res else: return res <= 0 def __lt__(self, other): res = self.__cmp__(other) if res is NotImplemented: return res else: return res < 0 def __gt__(self, other): res = self.__cmp__(other) if res is NotImplemented: return res else: return res > 0 def __ge__(self, other): res = self.__cmp__(other) if res is NotImplemented: return res else: return res >= 0 def append(self, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() super(FieldnameList, self).append(item) def count(self, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() return super(FieldnameList, self).count(item) def extend(self, other_list): for item in other_list: self.append(item) def index(self, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() return super(FieldnameList, self).index(item) def insert(self, pos, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() super(FieldnameList, self).insert(pos, item) def remove(self, item): item = ensure_unicode(item) if not isinstance(item, unicode): raise TypeError('%r cannot be a field name' % (item, )) item = item.upper() return super(FieldnameList, self).remove(item) # Internal classes class _Navigation(object): """ Navigation base class that provides VPFish movement methods """ _index = -1 def _nav_check(self): """ implemented by subclass; must return True if underlying structure meets need """ raise NotImplementedError() def _get_index(self, direction, 
n=1, start=None): """ returns index of next available record towards direction """ if start is not None: index = start else: index = self._index if direction == 'reverse': move = -1 * n limit = 0 index += move if index < limit: return -1 else: return index elif direction == 'forward': move = +1 * n limit = len(self) - 1 index += move if index > limit: return len(self) else: return index else: raise ValueError("direction should be 'forward' or 'reverse', not %r" % direction) @property def bof(self): """ returns True if no more usable records towards the beginning of the table """ self._nav_check() index = self._get_index('reverse') return index == -1 def bottom(self): """ sets record index to bottom of table (end of table) """ self._nav_check() self._index = len(self) return self._index @property def current_record(self): """ returns current record (deleted or not) """ self._nav_check() index = self._index if index < 0: return RecordVaporWare('bof', self) elif index >= len(self): return RecordVaporWare('eof', self) return self[index] @property def current(self): """ returns current index """ self._nav_check() return self._index @property def eof(self): """ returns True if no more usable records towards the end of the table """ self._nav_check() index = self._get_index('forward') return index == len(self) @property def first_record(self): """ returns first available record (does not move index) """ self._nav_check() index = self._get_index('forward', start=-1) if -1 < index < len(self): return self[index] else: return RecordVaporWare('bof', self) def goto(self, where): """ changes the record pointer to the first matching (deleted) record where should be either an integer, or 'top' or 'bottom'. 
top -> before first record bottom -> after last record """ self._nav_check() max = len(self) if isinstance(where, baseinteger): if not -max <= where < max: raise IndexError("Record %d does not exist" % where) if where < 0: where += max self._index = where return self._index move = getattr(self, where, None) if move is None: raise DbfError("unable to go to %r" % where) return move() @property def last_record(self): """ returns last available record (does not move index) """ self._nav_check() index = self._get_index('reverse', start=len(self)) if -1 < index < len(self): return self[index] else: return RecordVaporWare('bof', self) @property def next_record(self): """ returns next available record (does not move index) """ self._nav_check() index = self._get_index('forward') if -1 < index < len(self): return self[index] else: return RecordVaporWare('eof', self) @property def prev_record(self): """ returns previous available record (does not move index) """ self._nav_check() index = self._get_index('reverse') if -1 < index < len(self): return self[index] else: return RecordVaporWare('bof', self) def skip(self, n=1): """ move index to the next nth available record """ self._nav_check() if n < 0: n *= -1 direction = 'reverse' else: direction = 'forward' self._index = index = self._get_index(direction, n) if index < 0: raise Bof() elif index >= len(self): raise Eof() else: return index def top(self): """ sets record index to top of table (beginning of table) """ self._nav_check() self._index = -1 return self._index class Record(object): """ Provides routines to extract and save data within the fields of a dbf record. 
""" __slots__ = ('_recnum', '_meta', '_data', '_old_data', '_dirty', '_memos', '_write_to_disk', '__weakref__') def __new__(cls, recnum, layout, kamikaze=b'', _fromdisk=False): """ record = ascii array of entire record; layout=record specification; memo = memo object for table """ record = object.__new__(cls) record._dirty = False record._recnum = recnum record._meta = layout record._memos = {} record._write_to_disk = True record._old_data = None record._data = layout.blankrecord[:] if kamikaze and len(record._data) != len(kamikaze): raise BadDataError("record data is not the correct length (should be %r, not %r)" % (len(record._data), len(kamikaze)), data=kamikaze[:]) if recnum == -1: # not a disk-backed record return record elif type(kamikaze) == array: record._data = kamikaze[:] elif type(kamikaze) == bytes: if kamikaze: record._data = array('B', kamikaze) else: raise BadDataError("%r recieved for record data" % kamikaze) if record._data[0] == NULL: record._data[0] = SPACE if record._data[0] not in (SPACE, ASTERISK): # TODO: log warning instead logger.error( "record %d: invalid delete byte %h (should be SPACE or ASTERISK). 
" "Record will be considered active", record._data[0] ) if not _fromdisk and layout.location == ON_DISK: record._update_disk() return record def __contains__(self, value): for field in self._meta.user_fields: if self[field] == value: return True return False def __enter__(self): if not self._write_to_disk: raise DbfError("`with record` is not reentrant") self._start_flux() return self def __eq__(self, other): if not isinstance(other, (Record, RecordTemplate, dict, tuple)): return NotImplemented if isinstance(other, (Record, RecordTemplate)): if field_names(self) != field_names(other): return False for field in self._meta.user_fields: s_value, o_value = self[field], other[field] if s_value is not o_value and s_value != o_value: return False elif isinstance(other, dict): other = dict((ensure_unicode(k).upper(), v) for k, v in other.items()) if sorted(field_names(self)) != sorted(other.keys()): return False for field in self._meta.user_fields: s_value, o_value = self[field], other[field] if s_value is not o_value and s_value != o_value: return False else: # tuple if len(self) != len(other): return False for s_value, o_value in zip(self, other): if s_value is not o_value and s_value != o_value: return False return True def __exit__(self, *args): if args == (None, None, None): self._commit_flux() else: self._rollback_flux() def __iter__(self): return (self[field] for field in self._meta.user_fields) def __getattr__(self, name): if name[0:2] == '__' and name[-2:] == '__': raise AttributeError('Method %s is not implemented.' 
% name) name = name.upper() if not name in self._meta.fields: raise FieldMissingError(name) if name in self._memos: return self._memos[name] try: value = self._retrieve_field_value(name) return value except DbfError: error = sys.exc_info()[1] error.message = "error accessing field %r: %s" % (name, error.message) raise def __getitem__(self, item): if isinstance(item, baseinteger): fields = self._meta.user_fields field_count = len(fields) if not -field_count <= item < field_count: raise NotFoundError("Field offset %d is not in record" % item) field = fields[item] if field in self._memos: return self._memos[field] return self[field] elif isinstance(item, slice): sequence = [] if isinstance(item.start, basestring) or isinstance(item.stop, basestring): names = field_names(self) start, stop, step = ensure_unicode(item.start).upper(), ensure_unicode(item.stop).upper(), item.step if start not in names or stop not in names: raise FieldMissingError("Either %r or %r (or both) are not valid field names" % (start, stop)) if step is not None and not isinstance(step, baseinteger): raise DbfError("step value must be an integer, not %r" % type(step)) start = names.index(start) stop = names.index(stop) + 1 item = slice(start, stop, step) for index in self._meta.fields[item]: sequence.append(self[index]) return sequence elif isinstance(item, basestring): return self.__getattr__(item) else: raise TypeError("%r is not a field name" % item) def __len__(self): return self._meta.user_field_count def __ne__(self, other): if not isinstance(other, (Record, RecordTemplate, dict, tuple)): return NotImplemented return not self == other def __setattr__(self, name, value): if name in self.__slots__: object.__setattr__(self, name, value) return name = name.upper() if self._meta.status != READ_WRITE: raise DbfError("%s not in read/write mode" % self._meta.filename) elif self._write_to_disk: raise DbfError("unable to modify fields individually except in `with` or `Process()`") elif not name in
self._meta.fields: raise FieldMissingError(name) if name in self._meta.memofields: self._memos[name] = value self._dirty = True return try: self._update_field_value(name, value) except DbfError: error = sys.exc_info()[1] message = "field %r: %s" % (name, error.args) data = name err_cls = error.__class__ raise err_cls(message, data) def __setitem__(self, name, value): if self._meta.status != READ_WRITE: raise DbfError("%s not in read/write mode" % self._meta.filename) if self._write_to_disk: raise DbfError("unable to modify fields individually except in `with` or `Process()`") if isinstance(name, basestring): self.__setattr__(name.upper(), value) elif isinstance(name, baseinteger): self.__setattr__(self._meta.fields[name], value) elif isinstance(name, slice): sequence = [] names = field_names(self) if isinstance(name.start, basestring) or isinstance(name.stop, basestring): start, stop, step = ensure_unicode(name.start).upper(), ensure_unicode(name.stop).upper(), name.step if start not in names or stop not in names: raise FieldMissingError("Either %r or %r (or both) are not valid field names" % (start, stop)) if step is not None and not isinstance(step, baseinteger): raise DbfError("step value must be an integer, not %r" % type(step)) start = names.index(start) stop = names.index(stop) + 1 name = slice(start, stop, step) for field in self._meta.fields[name]: sequence.append(field) if len(sequence) != len(value): raise DbfError("length of slices not equal") for field, val in zip(sequence, value): self[field] = val else: raise TypeError("%s is not a field name" % name) def __str__(self): result = [] for seq, field in enumerate(field_names(self)): result.append("%3d - %-10s: %r" % (seq, field, self[field])) return '\n'.join(result) def __repr__(self): return '%r' % to_bytes(self._data) def _commit_flux(self): """ stores field updates to disk; if any errors restores previous contents and propagates exception """ if self._write_to_disk: raise DbfError("record not in
flux") try: self._write() except Exception: exc = sys.exc_info()[1] self._data[:] = self._old_data self._update_disk(data=self._old_data) raise DbfError("unable to write updates to disk, original data restored: %r" % (exc,)).from_exc(None) self._memos.clear() self._old_data = None self._write_to_disk = True self._reindex_record() @classmethod def _create_blank_data(cls, layout): """ creates a blank record data chunk """ record = object.__new__(cls) record._dirty = False record._recnum = -1 record._meta = layout record._data = array('B', b' ' * layout.header.record_length) layout.memofields = [] signature = [layout.table().codepage.name] for index, name in enumerate(layout.fields): if name == '_NULLFLAGS': record._data[layout['_NULLFLAGS'][START]:layout['_NULLFLAGS'][END]] = array('B', [0xff] * layout['_NULLFLAGS'][LENGTH]) for index, name in enumerate(layout.fields): signature.append(name) if name != '_NULLFLAGS': type = FieldType(layout[name][TYPE]) start = layout[name][START] size = layout[name][LENGTH] end = layout[name][END] blank = layout.fieldtypes[type]['Blank'] record._data[start:end] = array('B', blank(size)) if layout[name][TYPE] in layout.memo_types: layout.memofields.append(name) decimals = layout[name][DECIMALS] signature[-1] = '_'.join([unicode(x) for x in (signature[-1], type.symbol, size, decimals)]) layout.blankrecord = record._data[:] data_types = [] for fieldtype, defs in sorted(layout.fieldtypes.items()): if fieldtype != _NULLFLAG: # ignore the nullflags field data_types.append("%s_%s_%s" % (fieldtype.symbol, defs['Empty'], defs['Class'])) layout.record_sig = ('___'.join(signature), '___'.join(data_types)) def _reindex_record(self): """ rerun all indices with this record """ if self._meta.status == CLOSED: raise DbfError("%s is closed; cannot alter indices" % self._meta.filename) elif not self._write_to_disk: raise DbfError("unable to reindex record until it is written to disk") for dbfindex in self._meta.table()._indexen: dbfindex(self) def 
_retrieve_field_value(self, name): """ calls appropriate routine to convert value stored in field from array """ # check nullable here, binary is handled in the appropriate retrieve_* functions # index = self._meta.fields.index(name) fielddef = self._meta[name] flags = fielddef[FLAGS] nullable = flags & NULLABLE and '_NULLFLAGS' in self._meta if nullable: index = fielddef[NUL] byte, bit = divmod(index, 8) null_def = self._meta['_NULLFLAGS'] null_data = self._data[null_def[START]:null_def[END]] try: if null_data[byte] >> bit & 1: return Null except IndexError: print(null_data) print(index) print(byte, bit) print(len(self._data), self._data) print(null_def) print(null_data) raise record_data = self._data[fielddef[START]:fielddef[END]] field_type = fielddef[TYPE] retrieve = self._meta.fieldtypes[field_type]['Retrieve'] datum = retrieve(record_data, fielddef, self._meta.memo, self._meta.decoder) return datum def _rollback_flux(self): """ discards all changes since ._start_flux() """ if self._write_to_disk: raise DbfError("record not in flux") self._data = self._old_data self._old_data = None self._memos.clear() self._write_to_disk = True self._write() def _start_flux(self): """ Allows record.field_name = ... and record[...] 
= ...; must use ._commit_flux() to commit changes """ if self._meta.status == CLOSED: raise DbfError("%s is closed; cannot modify record" % self._meta.filename) elif self._recnum < 0: raise DbfError("record has been packed; unable to update") elif not self._write_to_disk: raise DbfError("record already in a state of flux") self._old_data = self._data[:] self._write_to_disk = False def _update_field_value(self, name, value): """ calls appropriate routine to convert value to bytes, and save it in record """ # check nullable here, binary is handled in the appropriate update_* functions fielddef = self._meta[name] index = fielddef[NUL] field_type = fielddef[TYPE] flags = fielddef[FLAGS] nullable = flags & NULLABLE and '_NULLFLAGS' in self._meta update = self._meta.fieldtypes[field_type]['Update'] if nullable: byte, bit = divmod(index, 8) null_def = self._meta['_NULLFLAGS'] null_data = self._data[null_def[START]:null_def[END]] if value is Null: null_data[byte] |= 1 << bit value = None else: null_data[byte] &= 0xff ^ 1 << bit self._data[null_def[START]:null_def[END]] = null_data if value is not Null: bytes = array('B', update(value, fielddef, self._meta.memo, self._meta.input_decoder, self._meta.encoder)) size = fielddef[LENGTH] if len(bytes) > size: raise DataOverflowError("tried to store %d bytes in %d byte field" % (len(bytes), size)) blank = array('B', b' ' * size) start = fielddef[START] end = start + size blank[:len(bytes)] = bytes[:] self._data[start:end] = blank[:] self._dirty = True def _update_disk(self, location='', data=None): layout = self._meta if self._recnum < 0: raise DbfError("cannot update a packed record") if layout.location == ON_DISK: header = layout.header if location == '': location = self._recnum * header.record_length + header.start if data is None: data = self._data layout.dfd.seek(location) layout.dfd.write(data) self._dirty = False table = layout.table() if table is not None: # is None when table is being destroyed for index in
table._indexen: index(self) def _write(self): for field, value in self._memos.items(): self._update_field_value(field, value) self._update_disk() class RecordTemplate(object): """ Provides routines to mimic a dbf record. """ __slots__ = ('_meta', '_data', '_old_data', '_memos', '_write_to_disk', '__weakref__') def _commit_flux(self): """ Flushes field updates to disk If any errors restores previous contents and raises `DbfError` """ if self._write_to_disk: raise DbfError("record not in flux") self._memos.clear() self._old_data = None self._write_to_disk = True def _retrieve_field_value(self, name): """ Calls appropriate routine to convert value stored in field from array """ # check nullable here, binary is handled in the appropriate retrieve_* functions fielddef = self._meta[name] flags = fielddef[FLAGS] nullable = flags & NULLABLE and '_NULLFLAGS' in self._meta if nullable: index = fielddef[NUL] byte, bit = divmod(index, 8) null_def = self._meta['_NULLFLAGS'] null_data = self._data[null_def[START]:null_def[END]] if null_data[byte] >> bit & 1: return Null record_data = self._data[fielddef[START]:fielddef[END]] field_type = fielddef[TYPE] retrieve = self._meta.fieldtypes[field_type]['Retrieve'] datum = retrieve(record_data, fielddef, self._meta.memo, self._meta.decoder) return datum def _rollback_flux(self): """ discards all changes since ._start_flux() """ if self._write_to_disk: raise DbfError("template not in flux") self._data = self._old_data self._old_data = None self._memos.clear() self._write_to_disk = True def _start_flux(self): """ Allows record.field_name = ... and record[...] 
= ...; must use ._commit_flux() to commit changes """ if not self._write_to_disk: raise DbfError("template already in a state of flux") self._old_data = self._data[:] self._write_to_disk = False def _update_field_value(self, name, value): """ calls appropriate routine to convert value to ascii bytes, and save it in record """ # check nullable here, binary is handled in the appropriate update_* functions fielddef = self._meta[name] index = fielddef[NUL] field_type = fielddef[TYPE] flags = fielddef[FLAGS] nullable = flags & NULLABLE and '_NULLFLAGS' in self._meta update = self._meta.fieldtypes[field_type]['Update'] if nullable: byte, bit = divmod(index, 8) null_def = self._meta['_NULLFLAGS'] null_data = self._data[null_def[START]:null_def[END]] #.tostring() if value is Null: null_data[byte] |= 1 << bit value = None else: null_data[byte] &= 0xff ^ 1 << bit self._data[null_def[START]:null_def[END]] = null_data if value is not Null: bytes = array('B', update(value, fielddef, self._meta.memo, self._meta.input_decoder, self._meta.encoder)) size = fielddef[LENGTH] if len(bytes) > size: raise DataOverflowError("tried to store %d bytes in %d byte field" % (len(bytes), size)) blank = array('B', b' ' * size) start = fielddef[START] end = start + size blank[:len(bytes)] = bytes[:] self._data[start:end] = blank[:] def __new__(cls, layout, original_record=None, defaults=None): """ record = ascii array of entire record; layout=record specification """ sig = layout.record_sig if sig not in dbf._Template_Records: table = layout.table() dbf._Template_Records[sig] = table.new( ':%s:' % layout.filename, default_data_types=table._meta._default_data_types, field_data_types=table._meta._field_data_types, on_disk=False )._meta layout = dbf._Template_Records[sig] record = object.__new__(cls) record._write_to_disk = True record._meta = layout record._memos = {} for name in layout.memofields: field_type = layout[name][TYPE] record._memos[name] = layout.fieldtypes[field_type]['Empty']() if
original_record is None: record._data = layout.blankrecord[:] else: record._data = original_record._data[:] for name in layout.memofields: record._memos[name] = original_record[name] for name in field_names(defaults or {}): record[name] = defaults[name] record._old_data = record._data[:] return record def __contains__(self, key): return key in self._meta.user_fields def __eq__(self, other): if not isinstance(other, (Record, RecordTemplate, dict, tuple)): return NotImplemented if isinstance(other, (Record, RecordTemplate)): if field_names(self) != field_names(other): return False for field in self._meta.user_fields: s_value, o_value = self[field], other[field] if s_value is not o_value and s_value != o_value: return False elif isinstance(other, dict): other = dict((ensure_unicode(k).upper(), v) for k, v in other.items()) if sorted(field_names(self)) != sorted(other.keys()): return False for field in self._meta.user_fields: s_value, o_value = self[field], other[field] if s_value is not o_value and s_value != o_value: return False else: # tuple if len(self) != len(other): return False for s_value, o_value in zip(self, other): if s_value is not o_value and s_value != o_value: return False return True def __iter__(self): return (self[field] for field in self._meta.user_fields) def __getattr__(self, name): if name[0:2] == '__' and name[-2:] == '__': raise AttributeError('Method %s is not implemented.' 
% name) name = name.upper() if not name in self._meta.fields: raise FieldMissingError(name) if name in self._memos: return self._memos[name] try: value = self._retrieve_field_value(name) return value except DbfError: fielddef = self._meta[name] error = sys.exc_info()[1] error.message = "field --%s-- is %s -> %s" % (name, self._meta.fieldtypes[fielddef['type']]['Type'], error.message) raise def __getitem__(self, item): fields = self._meta.user_fields if isinstance(item, baseinteger): field_count = len(fields) if not -field_count <= item < field_count: raise NotFoundError("Field offset %d is not in record" % item) field = fields[item] if field in self._memos: return self._memos[field] return self[field] elif isinstance(item, slice): sequence = [] if isinstance(item.start, basestring) or isinstance(item.stop, basestring): start, stop, step = item.start.upper(), item.stop.upper(), item.step if start not in fields or stop not in fields: raise FieldMissingError("Either %r or %r (or both) are not valid field names" % (start, stop)) if step is not None and not isinstance(step, baseinteger): raise DbfError("step value must be an integer, not %r" % type(step)) start = fields.index(start) stop = fields.index(stop) + 1 item = slice(start, stop, step) for index in self._meta.fields[item]: sequence.append(self[index]) return sequence elif isinstance(item, basestring): return self.__getattr__(item.upper()) else: raise TypeError("%r is not a field name" % item) def __len__(self): return self._meta.user_field_count def __ne__(self, other): if not isinstance(other, (Record, RecordTemplate, dict, tuple)): return NotImplemented return not self == other def __setattr__(self, name, value): if name in self.__slots__: object.__setattr__(self, name, value) return name = name.upper() if not name in self._meta.fields: raise FieldMissingError(name) if name in self._meta.memofields: self._memos[name] = value return try: self._update_field_value(name, value) except DbfError: error = 
sys.exc_info()[1] fielddef = self._meta[name] message = "%s (%s) = %r --> %s" % (name, self._meta.fieldtypes[fielddef[TYPE]]['Type'], value, error.message) data = name err_cls = error.__class__ raise err_cls(message, data).from_exc(None) def __setitem__(self, name, value): if isinstance(name, basestring): self.__setattr__(name.upper(), value) elif isinstance(name, baseinteger): self.__setattr__(self._meta.fields[name], value) elif isinstance(name, slice): sequence = [] names = field_names(self) if isinstance(name.start, basestring) or isinstance(name.stop, basestring): start, stop, step = name.start.upper(), name.stop.upper(), name.step if start not in names or stop not in names: raise FieldMissingError("Either %r or %r (or both) are not valid field names" % (start, stop)) if step is not None and not isinstance(step, baseinteger): raise DbfError("step value must be an integer, not %r" % type(step)) start = names.index(start) stop = names.index(stop) + 1 name = slice(start, stop, step) for field in self._meta.fields[name]: sequence.append(field) if len(sequence) != len(value): raise DbfError("length of slices not equal") for field, val in zip(sequence, value): self[field] = val else: raise TypeError("%s is not a field name" % name) def __repr__(self): return '%r' % to_bytes(self._data) def __str__(self): result = [] for seq, field in enumerate(field_names(self)): result.append("%3d - %-10s: %r" % (seq, field, self[field])) return '\n'.join(result) class RecordVaporWare(object): """ Provides routines to mimic a dbf record, but all values are non-existent. 
""" __slots__ = ('_recno', '_sequence') def __new__(cls, position, sequence): """ record = ascii array of entire record layout=record specification memo = memo object for table """ if position not in ('bof', 'eof'): raise ValueError("position should be 'bof' or 'eof', not %r" % position) vapor = object.__new__(cls) vapor._recno = (-1, None)[position == 'eof'] vapor._sequence = sequence return vapor def __contains__(self, key): return False def __eq__(self, other): if not isinstance(other, (Record, RecordTemplate, RecordVaporWare, dict, tuple)): return NotImplemented return False def __getattr__(self, name): if name[0:2] == '__' and name[-2:] == '__': raise AttributeError('Method %s is not implemented.' % name) else: return Vapor def __getitem__(self, item): if isinstance(item, baseinteger): return Vapor elif isinstance(item, slice): raise TypeError('slice notation not allowed on Vapor records') elif isinstance(item, basestring): return self.__getattr__(item) else: raise TypeError("%r is not a field name" % item) def __len__(self): raise TypeError("Vapor records have no length") def __ne__(self, other): if not isinstance(other, (Record, RecordTemplate, RecordVaporWare, dict, tuple)): return NotImplemented return True if py_ver < (3, 0): def __nonzero__(self): """ Vapor records are always False """ return False else: def __bool__(self): """ Vapor records are always False """ return False def __setattr__(self, name, value): if name in self.__slots__: object.__setattr__(self, name, value) return raise TypeError("cannot change Vapor record") def __setitem__(self, name, value): if isinstance(name, (basestring, baseinteger)): raise TypeError("cannot change Vapor record") elif isinstance(name, slice): raise TypeError("slice notation not allowed on Vapor records") else: raise TypeError("%s is not a field name" % name) def __repr__(self): return "RecordVaporWare(position=%r, sequence=%r)" % (('bof', 'eof')[recno(self) is None], self._sequence) def __str__(self): return 
'VaporRecord(%r)' % recno(self)

    @property
    def _recnum(self):
        if self._recno is None:
            return len(self._sequence)
        else:
            return self._recno


class _DbfMemo(object):
    """
    Provides access to memo fields as dictionaries

    Must override _init, _get_memo, and _put_memo to
    store memo contents to disk
    """

    def _init(self):
        """
        Initialize disk file usage
        """

    def _get_memo(self, block):
        """
        Retrieve memo contents from disk
        """

    def _put_memo(self, data):
        """
        Store memo contents to disk
        """

    def _zap(self):
        """
        Resets memo structure back to zero memos
        """
        self.memory.clear()
        self.nextmemo = 1

    def __init__(self, meta):
        self.meta = meta
        self.memory = {}
        self.nextmemo = 1
        self._init()
        self.meta.newmemofile = False

    def get_memo(self, block):
        """
        Gets the memo in block
        """
        if self.meta.ignorememos or not block:
            return ''
        if self.meta.location == ON_DISK:
            return self._get_memo(block)
        else:
            return self.memory[block]

    def put_memo(self, data):
        """
        Stores data in memo file, returns block number
        """
        if self.meta.ignorememos or data == '':
            return 0
        if self.meta.location == IN_MEMORY:
            thismemo = self.nextmemo
            self.nextmemo += 1
            self.memory[thismemo] = data
        else:
            thismemo = self._put_memo(data)
        return thismemo


class _Db3Memo(_DbfMemo):
    """
    dBase III specific
    """

    def _init(self):
        self.meta.memo_size = 512
        self.record_header_length = 2
        if self.meta.location == ON_DISK and not self.meta.ignorememos:
            if self.meta.newmemofile:
                self.meta.mfd = open(self.meta.memoname, 'w+b')
                self.meta.mfd.write(pack_long_int(1) + b'\x00' * 508)
            else:
                mode = ('rb', 'r+b')[self.meta.status is READ_WRITE]
                try:
                    self.meta.mfd = open(self.meta.memoname, mode)
                    self.meta.mfd.seek(0)
                    next = self.meta.mfd.read(4)
                    self.nextmemo = unpack_long_int(next)
                except Exception:
                    exc = sys.exc_info()[1]
                    raise DbfError("memo file appears to be corrupt: %r" % exc.args).from_exc(None)

    def _get_memo(self, block):
        block = int(block)
        self.meta.mfd.seek(block * self.meta.memo_size)
        eom = -1
        data = b''
        while eom == -1:
            newdata =
self.meta.mfd.read(self.meta.memo_size) if not newdata: return data data += newdata eom = data.find(b'\x1a\x1a') return data[:eom] def _put_memo(self, data): data = data length = len(data) + self.record_header_length # room for two ^Z at end of memo blocks = length // self.meta.memo_size if length % self.meta.memo_size: blocks += 1 thismemo = self.nextmemo self.nextmemo = thismemo + blocks self.meta.mfd.seek(0) self.meta.mfd.write(pack_long_int(self.nextmemo)) self.meta.mfd.seek(thismemo * self.meta.memo_size) self.meta.mfd.write(data) self.meta.mfd.write(b'\x1a\x1a') double_check = self._get_memo(thismemo) if len(double_check) != len(data): uhoh = open('dbf_memo_dump.err', 'wb') uhoh.write('thismemo: %d' % thismemo) uhoh.write('nextmemo: %d' % self.nextmemo) uhoh.write('saved: %d bytes' % len(data)) uhoh.write(data) uhoh.write('retrieved: %d bytes' % len(double_check)) uhoh.write(double_check) uhoh.close() raise DbfError("unknown error: memo not saved") return thismemo def _zap(self): if self.meta.location == ON_DISK and not self.meta.ignorememos: mfd = self.meta.mfd mfd.seek(0) mfd.truncate(0) mfd.write(pack_long_int(1) + b'\x00' * 508) mfd.flush() class _VfpMemo(_DbfMemo): """ Visual Foxpro 6 specific """ def _init(self): if self.meta.location == ON_DISK and not self.meta.ignorememos: self.record_header_length = 8 if self.meta.newmemofile: if self.meta.memo_size == 0: self.meta.memo_size = 1 elif 1 < self.meta.memo_size < 33: self.meta.memo_size *= 512 self.meta.mfd = open(self.meta.memoname, 'w+b') nextmemo = 512 // self.meta.memo_size if nextmemo * self.meta.memo_size < 512: nextmemo += 1 self.nextmemo = nextmemo self.meta.mfd.write(pack_long_int(nextmemo, bigendian=True) + b'\x00\x00' + \ pack_short_int(self.meta.memo_size, bigendian=True) + b'\x00' * 504) else: mode = ('rb', 'r+b')[self.meta.status is READ_WRITE] try: self.meta.mfd = open(self.meta.memoname, mode) self.meta.mfd.seek(0) header = self.meta.mfd.read(512) self.nextmemo = 
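The block arithmetic in `_Db3Memo._put_memo` above (512-byte blocks, with two bytes reserved for the trailing `^Z^Z` terminator) can be sketched standalone; `db3_memo_blocks` is a hypothetical helper name for illustration, not part of the dbf API:

```python
def db3_memo_blocks(data, memo_size=512, header=2):
    """Number of memo blocks needed for `data` plus the two-byte ^Z^Z
    terminator (mirrors the arithmetic in _Db3Memo._put_memo above)."""
    length = len(data) + header
    blocks = length // memo_size
    if length % memo_size:          # partial block still costs a whole block
        blocks += 1
    return blocks

print(db3_memo_blocks(b'x' * 510))   # 1 -- 510 bytes + terminator fit exactly
print(db3_memo_blocks(b'x' * 511))   # 2 -- terminator spills into a second block
```

The next memo's block number is then advanced by this count, which is why `_put_memo` rewrites the header's next-memo pointer before writing the data.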
unpack_long_int(header[:4], bigendian=True) self.meta.memo_size = unpack_short_int(header[6:8], bigendian=True) except Exception: exc = sys.exc_info()[1] raise DbfError("memo file appears to be corrupt: %r" % exc.args).from_exc(None) def _get_memo(self, block): self.meta.mfd.seek(block * self.meta.memo_size) header = self.meta.mfd.read(8) length = unpack_long_int(header[4:], bigendian=True) return self.meta.mfd.read(length) def _put_memo(self, data): data = data self.meta.mfd.seek(0) thismemo = unpack_long_int(self.meta.mfd.read(4), bigendian=True) self.meta.mfd.seek(0) length = len(data) + self.record_header_length blocks = length // self.meta.memo_size if length % self.meta.memo_size: blocks += 1 self.meta.mfd.write(pack_long_int(thismemo + blocks, bigendian=True)) self.meta.mfd.seek(thismemo * self.meta.memo_size) self.meta.mfd.write(b'\x00\x00\x00\x01' + pack_long_int(len(data), bigendian=True) + data) return thismemo def _zap(self): if self.meta.location == ON_DISK and not self.meta.ignorememos: mfd = self.meta.mfd mfd.seek(0) mfd.truncate(0) nextmemo = 512 // self.meta.memo_size if nextmemo * self.meta.memo_size < 512: nextmemo += 1 self.nextmemo = nextmemo mfd.write(pack_long_int(nextmemo, bigendian=True) + b'\x00\x00' + \ pack_short_int(self.meta.memo_size, bigendian=True) + b'\x00' * 504) mfd.flush() class DbfCsv(csv.Dialect): """ csv format for exporting tables """ delimiter = ',' doublequote = True escapechar = None lineterminator = '\n' quotechar = '"' skipinitialspace = True quoting = csv.QUOTE_NONNUMERIC csv.register_dialect('dbf', DbfCsv) class _DeadObject(object): """ used because you cannot weakref None """ if py_ver < (3, 0): def __nonzero__(self): return False else: def __bool__(self): return False _DeadObject = _DeadObject() # Routines for saving, retrieving, and creating fields VFPTIME = 1721425 def pack_short_int(value, bigendian=False): """ Returns a two-bye integer from the value, or raises DbfError """ # 256 / 65,536 if value > 65535: raise 
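The `DbfCsv` dialect above quotes text fields but leaves numbers bare (`QUOTE_NONNUMERIC`). This sketch re-declares the same settings under a hypothetical `'dbf-demo'` name so it runs standalone without importing dbf:

```python
import csv
import io

# Same parameters as the DbfCsv dialect above, registered under a demo name.
csv.register_dialect('dbf-demo', delimiter=',', doublequote=True,
                     quotechar='"', quoting=csv.QUOTE_NONNUMERIC,
                     skipinitialspace=True, lineterminator='\n')

buf = io.StringIO()
csv.writer(buf, dialect='dbf-demo').writerow(['John', 42, 3.5])
print(buf.getvalue())   # "John",42,3.5
```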
DataOverflowError("Maximum Integer size exceeded. Possible: 65535. Attempted: %d" % value) if bigendian: return struct.pack('>H', value) else: return struct.pack(' 4294967295: raise DataOverflowError("Maximum Integer size exceeded. Possible: 4294967295. Attempted: %d" % value) if bigendian: return struct.pack('>L', value) else: return struct.pack(' 10: raise DbfError("Maximum string size is ten characters -- %s has %d characters" % (string, len(string))) return struct.pack('11s', string.upper()) def unpack_short_int(bytes, bigendian=False): """ Returns the value in the two-byte integer passed in """ if bigendian: return struct.unpack('>H', bytes)[0] else: return struct.unpack('L', bytes)[0]) else: return int(struct.unpack(' maxintegersize: if integersize != 1: raise DataOverflowError('Integer portion too big') string = scinot(value, decimalsize) if len(string) > totalsize: raise DataOverflowError('Value representation too long for field') return ("%*.*f" % (fielddef[LENGTH], fielddef[DECIMALS], value)).encode('ascii') def retrieve_vfp_datetime(bytes, fielddef, *ignore): """ returns the date/time stored in bytes; dates <= 01/01/1981 00:00:00 may not be accurate; BC dates are nulled. """ # two four-byte integers store the date and time. 
# millesecords are discarded from time if bytes == array('B', [0] * 8): cls = fielddef[EMPTY] if cls is NoneType: return None return cls() cls = fielddef[CLASS] time = unpack_long_int(bytes[4:]) microseconds = (time % 1000) * 1000 time = time // 1000 # int(round(time, -3)) // 1000 discard milliseconds hours = time // 3600 mins = time % 3600 // 60 secs = time % 3600 % 60 time = datetime.time(hours, mins, secs, microseconds) possible = unpack_long_int(bytes[:4]) possible -= VFPTIME possible = max(0, possible) date = datetime.date.fromordinal(possible) return cls(date.year, date.month, date.day, time.hour, time.minute, time.second, time.microsecond) def update_vfp_datetime(moment, *ignore): """ Sets the date/time stored in moment moment must have fields: year, month, day, hour, minute, second, microsecond """ data = [0] * 8 if moment: hour = moment.hour minute = moment.minute second = moment.second millisecond = moment.microsecond // 1000 # convert from millionths to thousandths time = ((hour * 3600) + (minute * 60) + second) * 1000 + millisecond data[4:] = update_integer(time) data[:4] = update_integer(moment.toordinal() + VFPTIME) return to_bytes(data) def retrieve_clp_timestamp(bytes, fielddef, *ignore): """ returns the timestamp stored in bytes """ # First long repesents date and second long time. # Date is the number of days since January 1st, 4713 BC. # Time is hours * 3600000L + minutes * 60000L + seconds * 1000L # http://www.manmrk.net/tutorials/database/xbase/data_types.html if bytes == array('B', [0] * 8): cls = fielddef[EMPTY] if cls is NoneType: return None return cls() cls = fielddef[CLASS] days = unpack_long_int(bytes[:4]) # how many days between -4713-01-01 and 0001-01-01 ? 
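`retrieve_vfp_datetime`/`update_vfp_datetime` above store a date as proleptic ordinal plus `VFPTIME` and a time as milliseconds since midnight, each in a four-byte integer. A hedged round-trip sketch with hypothetical helper names (the BC-date clamping done by the real retrieve function is omitted):

```python
import datetime

VFPTIME = 1721425   # ordinal offset used by Visual FoxPro dates, as above

def encode_vfp_datetime(moment):
    """Return the (days, milliseconds) pair VFP stores in its two ints."""
    days = moment.toordinal() + VFPTIME
    millis = ((moment.hour * 3600 + moment.minute * 60 + moment.second) * 1000
              + moment.microsecond // 1000)   # millionths -> thousandths
    return days, millis

def decode_vfp_datetime(days, millis):
    date = datetime.date.fromordinal(days - VFPTIME)
    secs, ms = divmod(millis, 1000)
    hours, rem = divmod(secs, 3600)
    mins, s = divmod(rem, 60)
    return datetime.datetime(date.year, date.month, date.day, hours, mins, s, ms * 1000)

dt = datetime.datetime(2001, 2, 3, 4, 5, 6, 7000)
assert decode_vfp_datetime(*encode_vfp_datetime(dt)) == dt
```

Note that sub-millisecond precision is lost in the encoding, which is why the module discards the microsecond remainder.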
going to guess 1,721,425 BC = 1721425 if days < BC: # bail cls = fielddef[EMPTY] if cls is NoneType: return None return cls() date = datetime.date.fromordinal(days-BC) time = unpack_long_int(bytes[4:]) microseconds = (time % 1000) * 1000 time = time // 1000 # int(round(time, -3)) // 1000 discard milliseconds hours = time // 3600 mins = time % 3600 // 60 secs = time % 3600 % 60 time = datetime.time(hours, mins, secs, microseconds) return cls(date.year, date.month, date.day, time.hour, time.minute, time.second, time.microsecond) def update_clp_timestamp(moment, *ignore): """ Sets the timestamp stored in moment moment must have fields: year, month, day, hour, minute, second, microsecond """ data = [0] * 8 if moment: BC = 1721425 days = BC + moment.toordinal() hour = moment.hour minute = moment.minute second = moment.second millisecond = moment.microsecond // 1000 # convert from millionths to thousandths time = ((hour * 3600) + (minute * 60) + second) * 1000 + millisecond data[:4] = pack_long_int(days) data[4:] = pack_long_int(time) return to_bytes(data) def retrieve_vfp_memo(bytes, fielddef, memo, decoder): """ Returns the block of data from a memo file """ if memo is None: block = 0 else: block = struct.unpack(' 1 or format[0][0] != '(' or format[0][-1] != ')' or any(f not in flags for f in format[1:]): raise FieldSpecError("Format for Numeric field creation is 'N(s,d)%s', not 'N%s'" % field_spec_error_text(format, flags)) length, decimals = format[0][1:-1].split(',') length = int(length) decimals = int(decimals) flag = 0 for f in format[1:]: flag |= FieldFlag.lookup(f) if not 0 < length <= 20: raise FieldSpecError("Numeric fields must be between 1 and 20 digits, not %d" % length) if decimals and not 0 < decimals <= length - 2: raise FieldSpecError("Decimals must be between 0 and Length-2 (Length: %d, Decimals: %d)" % (length, decimals)) return length, decimals, flag def add_clp_character(format, flags): if format[0][0] != '(' or format[0][-1] != ')' or any([f not in 
flags for f in format[1:]]): raise FieldSpecError("Format for Character field creation is 'C(n)%s', not 'C%s'" % field_spec_error_text(format, flags)) length = int(format[0][1:-1]) if not 0 < length < 65519: raise FieldSpecError("Character fields must be between 1 and 65,519") decimals = 0 flag = 0 for f in format[1:]: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_character(format, flags): if format[0][0] != '(' or format[0][-1] != ')' or any([f not in flags for f in format[1:]]): raise FieldSpecError("Format for Character field creation is 'C(n)%s', not 'C%s'" % field_spec_error_text(format, flags)) length = int(format[0][1:-1]) if not 0 < length < 255: raise FieldSpecError("Character fields must be between 1 and 255") decimals = 0 flag = 0 for f in format[1:]: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_currency(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for Currency field creation is 'Y%s', not 'Y%s'" % field_spec_error_text(format, flags)) length = 8 decimals = 0 flag = 0 for f in format: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_datetime(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for DateTime field creation is 'T%s', not 'T%s'" % field_spec_error_text(format, flags)) length = 8 decimals = 0 flag = 0 for f in format: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_double(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for Double field creation is 'B%s', not 'B%s'" % field_spec_error_text(format, flags)) length = 8 decimals = 0 flag = 0 for f in format: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_integer(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for Integer field creation is 'I%s', not 'I%s'" % field_spec_error_text(format, flags)) length = 4 decimals = 0 flag = 0 for f in 
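The numeric `add_*` validators above all parse a `'(length,decimals)'` spec and enforce the same bounds. A standalone sketch of that parse-and-validate step, with a hypothetical name `parse_numeric_spec` and `ValueError` standing in for the module's `FieldSpecError`:

```python
def parse_numeric_spec(spec):
    """Parse the '(length,decimals)' part of an 'N(s,d)' field spec
    (sketch of the validation in add_numeric/add_vfp_numeric above)."""
    if not (spec.startswith('(') and spec.endswith(')')):
        raise ValueError("Format for Numeric field creation is 'N(s,d)'")
    length, decimals = (int(part) for part in spec[1:-1].split(','))
    if not 0 < length <= 20:
        raise ValueError("Numeric fields must be between 1 and 20 digits, not %d" % length)
    if decimals and not 0 < decimals <= length - 2:
        raise ValueError("Decimals must be between 0 and Length-2 (Length: %d, Decimals: %d)"
                         % (length, decimals))
    return length, decimals

print(parse_numeric_spec('(10,2)'))   # (10, 2)
```

The `length - 2` bound leaves room for the sign and the decimal point in the text representation.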
format: flag |= FieldFlag.lookup(f) return length, decimals, flag def add_vfp_memo(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for Memo field creation is 'M%s', not 'M%s'" % field_spec_error_text(format, flags)) length = 4 decimals = 0 flag = 0 for f in format: flag |= FieldFlag.lookup(f) if 'BINARY' not in flags: # general or picture -- binary is implied flag |= FieldFlag.BINARY return length, decimals, flag def add_vfp_binary_memo(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for Memo field creation is 'M%s', not 'M%s'" % field_spec_error_text(format, flags)) length = 4 decimals = 0 flag = 0 for f in format: flag |= FieldFlag.lookup(f) # general or picture -- binary is implied flag |= FieldFlag.BINARY return length, decimals, flag def add_vfp_numeric(format, flags): if format[0][0] != '(' or format[0][-1] != ')' or any(f not in flags for f in format[1:]): raise FieldSpecError("Format for Numeric field creation is 'N(s,d)%s', not 'N%s'" % field_spec_error_text(format, flags)) length, decimals = format[0][1:-1].split(',') length = int(length) decimals = int(decimals) flag = 0 for f in format[1:]: flag |= FieldFlag.lookup(f) if not 0 < length < 21: raise FieldSpecError("Numeric fields must be between 1 and 20 digits, not %d" % length) if decimals and not 0 < decimals <= length - 2: raise FieldSpecError("Decimals must be between 0 and Length-2 (Length: %d, Decimals: %d)" % (length, decimals)) return length, decimals, flag def add_clp_timestamp(format, flags): if any(f not in flags for f in format): raise FieldSpecError("Format for TimeStamp field creation is '@%s', not '@%s'" % field_spec_error_text(format, flags)) length = 8 decimals = 0 flag = 0 for f in format[1:]: flag |= FieldFlag.lookup(f) return length, decimals, flag def field_spec_error_text(format, flags): """ generic routine for error text for the add...() functions """ flg = '' if flags: flg = ' [ ' + ' | '.join(flags) + ' ]' 
frmt = '' if format: frmt = ' ' + ' '.join(format) return flg, frmt def ezip(*iters): """ extends all iters to longest one, using last value from each as necessary """ iters = [iter(x) for x in iters] last = [None] * len(iters) while "any iters have items left": alive = len(iters) for i, iterator in enumerate(iters): try: value = next(iterator) last[i] = value except StopIteration: alive -= 1 if alive: yield tuple(last) alive = len(iters) continue break def unicode_error_handler(decoder, encoder, errors): if errors in ('ignore', 'replace'): decoder = partial(decoder, errors=errors) encoder = partial(encoder, errors=errors) elif errors in ('xmlcharrefreplace', 'backslashreplace'): decoder = partial(decoder, errors='replace') encoder = partial(encoder, errors=errors) return decoder, encoder # Public classes class Tables(object): """ context manager for multiple tables and/or indices """ def __init__(self, *tables): if len(tables) == 1 and not isinstance(tables[0], (Table, basestring)): tables = tables[0] self._tables = [] self._entered = [] for table in tables: if isinstance(table, basestring): table = Table(table) self._tables.append(table) def __enter__(self): for table in self._tables: table.__enter__() self._entered.append(table) return tuple(self._tables) def __exit__(self, *args): while self._entered: table = self._entered.pop() try: table.__exit__() except Exception: pass class Index(_Navigation): """ non-persistent index for a table """ def __init__(self, table, key): self._table = table self._values = [] # ordered list of values self._rec_by_val = [] # matching record numbers self._records = {} # record numbers:values self.__doc__ = key.__doc__ or 'unknown' self._key = key self._previous_status = [] for record in table: value = key(record) if value is DoNotIndex: continue rec_num = recno(record) if not isinstance(value, tuple): value = (value, ) vindex = bisect_right(self._values, value) self._values.insert(vindex, value) self._rec_by_val.insert(vindex, 
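The `ezip` generator above extends shorter iterables by repeating their last value until the longest one is exhausted. A standalone re-implementation (named `ezip_sketch` to avoid clashing with the module's own) that illustrates the behavior:

```python
def ezip_sketch(*iters):
    """Extend all iterables to the longest one, repeating each iterable's
    last value as necessary (mirrors the ezip generator above)."""
    iters = [iter(x) for x in iters]
    last = [None] * len(iters)
    while True:
        alive = len(iters)
        for i, iterator in enumerate(iters):
            try:
                last[i] = next(iterator)
            except StopIteration:
                alive -= 1          # exhausted iterators keep their last value
        if not alive:
            return
        yield tuple(last)

print(list(ezip_sketch([1, 2, 3], 'ab')))   # [(1, 'a'), (2, 'b'), (3, 'b')]
```

Unlike `itertools.zip_longest`, which pads with a fixed fill value, this repeats whatever each iterable produced last.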
rec_num) self._records[rec_num] = value table._indexen.add(self) def __call__(self, record): rec_num = recno(record) key = self.key(record) if rec_num in self._records: if self._records[rec_num] == key: return old_key = self._records[rec_num] vindex = bisect_left(self._values, old_key) self._values.pop(vindex) self._rec_by_val.pop(vindex) del self._records[rec_num] assert rec_num not in self._records if key == (DoNotIndex, ): return vindex = bisect_right(self._values, key) self._values.insert(vindex, key) self._rec_by_val.insert(vindex, rec_num) self._records[rec_num] = key def __contains__(self, data): if not isinstance(data, (Record, RecordTemplate, tuple, dict)): raise TypeError("%r is not a record, templace, tuple, nor dict" % (data, )) try: value = self.key(data) return value in self._values except Exception: for record in self: if record == data: return True return False def __getitem__(self, key): '''if key is an integer, returns the matching record; if key is a [slice | string | tuple | record] returns a List; raises NotFoundError on failure''' if isinstance(key, baseinteger): count = len(self._values) if not -count <= key < count: raise NotFoundError("Record %d is not in list." 
% key) rec_num = self._rec_by_val[key] return self._table[rec_num] elif isinstance(key, slice): result = List() start, stop, step = key.start, key.stop, key.step if start is None: start = 0 if stop is None: stop = len(self._rec_by_val) if step is None: step = 1 if step < 0: start, stop = stop - 1, -(stop - start + 1) for loc in range(start, stop, step): record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) return result elif isinstance (key, (basestring, tuple, Record, RecordTemplate)): if isinstance(key, (Record, RecordTemplate)): key = self.key(key) elif isinstance(key, basestring): key = (key, ) lo = self._search(key, where='left') hi = self._search(key, where='right') if lo == hi: raise NotFoundError(key) result = List(desc='match = %r' % (key, )) for loc in range(lo, hi): record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) return result else: raise TypeError('indices must be integers, match objects must by strings or tuples') def __enter__(self): self._table.__enter__() return self def __exit__(self, *exc_info): self._table.__exit__() return False def __iter__(self): return Iter(self) def __len__(self): return len(self._records) def _clear(self): """ removes all entries from index """ self._values[:] = [] self._rec_by_val[:] = [] self._records.clear() def _key(self, record): """ table_name, record_number """ self._still_valid_check() return source_table(record), recno(record) def _nav_check(self): """ raises error if table is closed """ if self._table._meta.status == CLOSED: raise DbfError('indexed table %s is closed' % self.filename) def _partial_match(self, target, match): target = target[:len(match)] if isinstance(match[-1], basestring): target = list(target) target[-1] = target[-1][:len(match[-1])] target = tuple(target) return target == match def _purge(self, rec_num): value = self._records.get(rec_num) if value 
is not None: vindex = bisect_left(self._values, value) del self._records[rec_num] self._values.pop(vindex) self._rec_by_val.pop(vindex) def _reindex(self): """ reindexes all records """ for record in self._table: self(record) def _search(self, match, lo=0, hi=None, where=None): if hi is None: hi = len(self._values) if where == 'left': return bisect_left(self._values, match, lo, hi) elif where == 'right': return bisect_right(self._values, match, lo, hi) def index(self, record, start=None, stop=None): """ returns the index of record between start and stop start and stop default to the first and last record """ if not isinstance(record, (Record, RecordTemplate, dict, tuple)): raise TypeError("x should be a record, template, dict, or tuple, not %r" % type(record)) self._nav_check() if start is None: start = 0 if stop is None: stop = len(self) for i in range(start, stop): if record == (self[i]): return i else: raise NotFoundError("dbf.Index.index(x): x not in Index", data=record) def index_search(self, match, start=None, stop=None, nearest=False, partial=False): """ returns the index of match between start and stop start and stop default to the first and last record. 
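As `_search` above shows, `Index` keeps two parallel sorted lists (key tuples and matching record numbers) and uses `bisect` for all lookups. A minimal sketch of that structure with hypothetical `insert`/`search` helpers:

```python
from bisect import bisect_left, bisect_right

# Parallel lists, kept sorted by key, as in dbf.Index:
values, recnos = [], []

def insert(key, recno):
    """Insert keeping both lists sorted (cf. Index.__call__ above)."""
    i = bisect_right(values, key)
    values.insert(i, key)
    recnos.insert(i, recno)

def search(key):
    """Record numbers whose key equals `key` (cf. Index._search/search)."""
    lo = bisect_left(values, key)
    hi = bisect_right(values, key)
    return recnos[lo:hi]

for n, k in enumerate([('b',), ('a',), ('b',), ('c',)]):
    insert(k, n)
print(search(('b',)))   # [0, 2]
```

Because `bisect_left` and `bisect_right` bracket the run of equal keys, duplicates come back in insertion order without any extra bookkeeping.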
if nearest is true returns the location of where the match should be otherwise raises NotFoundError """ self._nav_check() if not isinstance(match, tuple): match = (match, ) if start is None: start = 0 if stop is None: stop = len(self) loc = self._search(match, start, stop, where='left') if loc == len(self._values): if nearest: return IndexLocation(loc, False) raise NotFoundError("dbf.Index.index_search(x): x not in index", data=match) if self._values[loc] == match \ or partial and self._partial_match(self._values[loc], match): return IndexLocation(loc, True) elif nearest: return IndexLocation(loc, False) else: raise NotFoundError("dbf.Index.index_search(x): x not in Index", data=match) def key(self, record): result = self._key(record) if not isinstance(result, tuple): result = (result, ) return result def query(self, criteria): """ criteria is a callback that returns a truthy value for matching record """ self._nav_check() return pqlc(self, criteria) def search(self, match, partial=False): """ returns dbf.List of all (partially) matching records """ self._nav_check() result = List() if not isinstance(match, tuple): match = (match, ) loc = self._search(match, where='left') if loc == len(self._values): return result while loc < len(self._values) and self._values[loc] == match: record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) loc += 1 if partial: while loc < len(self._values) and self._partial_match(self._values[loc], match): record = self._table[self._rec_by_val[loc]] result._maybe_add(item=(self._table, self._rec_by_val[loc], result.key(record))) loc += 1 return result class Relation(object): """ establishes a relation between two dbf tables (not persistent) """ relations = {} def __new__(cls, src, tgt, src_names=None, tgt_names=None): if (len(src) != 2 or len(tgt) != 2): raise DbfError("Relation should be called with ((src_table, src_field), (tgt_table, tgt_field))") if src_names and 
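`_partial_match` above compares a truncated key tuple, treating a trailing string element as a prefix. A standalone sketch (Python 3 `str` standing in for the module's `basestring`):

```python
def partial_match(target, match):
    """True if `match` is a leading sub-tuple of `target`, comparing the
    last element as a string prefix (mirrors Index._partial_match above)."""
    target = target[:len(match)]
    if isinstance(match[-1], str):
        target = list(target)
        target[-1] = target[-1][:len(match[-1])]
        target = tuple(target)
    return target == match

print(partial_match(('Smith', 'John'), ('Smi',)))   # True
print(partial_match(('Smith', 'John'), ('Smy',)))   # False
```

This is what lets `search(..., partial=True)` and `index_search(..., partial=True)` keep walking past the exact-match run to pick up prefix matches.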
len(src_names) !=2 or tgt_names and len(tgt_names) != 2: raise DbfError('src_names and tgt_names, if specified, must be ("table","field")') src_table, src_field = src tgt_table, tgt_field = tgt try: if isinstance(src_field, baseinteger): table, field = src_table, src_field src_field = table.field_names[field] else: src_table.field_names.index(src_field) if isinstance(tgt_field, baseinteger): table, field = tgt_table, tgt_field tgt_field = table.field_names[field] else: tgt_table.field_names.index(tgt_field) except (IndexError, ValueError): raise DbfError('%r not in %r' % (field, table)).from_exc(None) if src_names: src_table_name, src_field_name = src_names else: src_table_name, src_field_name = src_table.filename, src_field if src_table_name[-4:].lower() == '.dbf': src_table_name = src_table_name[:-4] if tgt_names: tgt_table_name, tgt_field_name = tgt_names else: tgt_table_name, tgt_field_name = tgt_table.filename, tgt_field if tgt_table_name[-4:].lower() == '.dbf': tgt_table_name = tgt_table_name[:-4] relation = cls.relations.get(((src_table, src_field), (tgt_table, tgt_field))) if relation is not None: return relation obj = object.__new__(cls) obj._src_table, obj._src_field = src_table, src_field obj._tgt_table, obj._tgt_field = tgt_table, tgt_field obj._src_table_name, obj._src_field_name = src_table_name, src_field_name obj._tgt_table_name, obj._tgt_field_name = tgt_table_name, tgt_field_name obj._tables = dict() cls.relations[((src_table, src_field), (tgt_table, tgt_field))] = obj return obj def __eq__(self, other): if (self.src_table == other.src_table and self.src_field == other.src_field and self.tgt_table == other.tgt_table and self.tgt_field == other.tgt_field): return True return False def __getitem__(self, record): """ record should be from the source table """ key = (record[self.src_field], ) try: return self.index[key] except NotFoundError: return List(desc='%s not found' % key) def __hash__(self): return hash((self.src_table, self.src_field, 
self.tgt_table, self.tgt_field)) def __ne__(self, other): if (self.src_table != other.src_table or self.src_field != other.src_field or self.tgt_table != other.tgt_table or self.tgt_field != other.tgt_field): return True return False def __repr__(self): return "Relation((%r, %r), (%r, %r))" % (self.src_table_name, self.src_field, self.tgt_table_name, self.tgt_field) def __str__(self): return "%s:%s --> %s:%s" % (self.src_table_name, self.src_field_name, self.tgt_table_name, self.tgt_field_name) @property def src_table(self): "name of source table" return self._src_table @property def src_field(self): "name of source field" return self._src_field @property def src_table_name(self): return self._src_table_name @property def src_field_name(self): return self._src_field_name @property def tgt_table(self): "name of target table" return self._tgt_table @property def tgt_field(self): "name of target field" return self._tgt_field @property def tgt_table_name(self): return self._tgt_table_name @property def tgt_field_name(self): return self._tgt_field_name @LazyAttr def index(self): def index(record, field=self._tgt_field): return record[field] index.__doc__ = "%s:%s --> %s:%s" % (self.src_table_name, self.src_field_name, self.tgt_table_name, self.tgt_field_name) self.index = self._tgt_table.create_index(index) source = List(self._src_table, key=lambda rec, field=self._src_field: rec[field]) target = List(self._tgt_table, key=lambda rec, field=self._tgt_field: rec[field]) if len(source) != len(self._src_table): self._tables[self._src_table] = 'many' else: self._tables[self._src_table] = 'one' if len(target) != len(self._tgt_table): self._tables[self._tgt_table] = 'many' else: self._tables[self._tgt_table] = 'one' return self.index def one_or_many(self, table): self.index # make sure self._tables has been populated try: if isinstance(table, basestring): table = (self._src_table, self._tgt_table)[self._tgt_table_name == table] return self._tables[table] except IndexError: 
raise NotFoundError("table %s not in relation" % table).from_exc(None) class IndexLocation(long): """ Represents the index where the match criteria is if True, or would be if False Used by Index.index_search """ def __new__(cls, value, found): "value is the number, found is True/False" result = long.__new__(cls, value) result.found = found return result if py_ver < (3, 0): def __nonzero__(self): return self.found else: def __bool__(self): return self.found class FieldInfo(NamedTuple): """ tuple with named attributes for representing a field's dbf type, length, decimal portion, and python class """ field_type = 0, "dbf field type (C, N, D, etc.)" length = 1, "overall length of field" decimal = 2, "number of decimal places for numeric fields" py_type = 3, "Python class for this field (Char, Logical, default, etc.)" class CodePage(NamedTuple): """ tuple with named attributes for representing a tables codepage """ def __new__(cls, name): "call with name of codepage (e.g. 'cp1252')" code, name, desc = _codepage_lookup(name) return tuple.__new__(cls, (name, desc, code)) def __repr__(self): return "CodePage(%r, %r, %02x)" % self def __str__(self): return "%s (%s)" % (self[0], self[1]) name = 0, "name of code page" desc = 1, "description of code page" code = 2, "numeric code of code page" class Iter(_Navigation): """ Provides iterable behavior for a table """ def __init__(self, table, include_vapor=False): """ Return a Vapor record as the last record in the iteration if include_vapor is True """ self._table = table self._record = None self._include_vapor = include_vapor self._exhausted = False def __iter__(self): return self if py_ver < (3, 0): def next(self): while not self._exhausted: if self._index == len(self._table): break if self._index >= (len(self._table) - 1): self._index = max(self._index, len(self._table)) if self._include_vapor: return RecordVaporWare('eof', self._table) break self._index += 1 record = self._table[self._index] return record self._exhausted = 
True raise StopIteration else: def __next__(self): while not self._exhausted: if self._index == len(self._table): break if self._index >= (len(self._table) - 1): self._index = max(self._index, len(self._table)) if self._include_vapor: return RecordVaporWare('eof', self._table) break self._index += 1 record = self._table[self._index] return record self._exhausted = True raise StopIteration class Table(_Navigation): """ Base class for dbf style tables """ _version = 'basic memory table' _versionabbr = 'dbf' _max_fields = 255 _max_records = 4294967296 @MutableDefault def _field_types(): return { CHAR: { 'Type':'Character', 'Init':add_character, 'Blank':lambda x: b' ' * x, 'Retrieve':retrieve_character, 'Update':update_character, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, DATE: { 'Type':'Date', 'Init':add_date, 'Blank':lambda x: b' ', 'Retrieve':retrieve_date, 'Update':update_date, 'Class':datetime.date, 'Empty':none, 'flags':tuple(), }, NUMERIC: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, LOGICAL: { 'Type':'Logical', 'Init':add_logical, 'Blank':lambda x: b'?', 'Retrieve':retrieve_logical, 'Update':update_logical, 'Class':bool, 'Empty':none, 'flags':tuple(), }, MEMO: { 'Type':'Memo', 'Init':add_memo, 'Blank':lambda x: b' ', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, FLOAT: { 'Type':'Numeric', 'Init':add_numeric, 'Blank':lambda x: b' ' * x, 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, } @MutableDefault def _previous_status(): return [] _memoext = '' _memoClass = _DbfMemo _yesMemoMask = 0 _noMemoMask = 0 _binary_types = tuple() # as in non-unicode character, or non-text number _character_types = (CHAR, DATE, FLOAT, LOGICAL, MEMO, NUMERIC) # field represented by text data _currency_types = tuple() # money! 
_date_types = (DATE, ) # dates _datetime_types = tuple() # dates w/times _decimal_types = (NUMERIC, FLOAT) # text-based numeric fields _fixed_types = (MEMO, DATE, LOGICAL) # always same length in table _logical_types = (LOGICAL, ) # logicals _memo_types = (MEMO, ) _numeric_types = (NUMERIC, FLOAT) # fields representing a number _variable_types = (CHAR, NUMERIC, FLOAT) # variable length in table _dbfTableHeader = array('B', [0] * 32) _dbfTableHeader[0] = 0 # table type - none _dbfTableHeader[8:10] = array('B', pack_short_int(33)) _dbfTableHeader[10] = 1 # record length -- one for delete flag _dbfTableHeader[29] = 0 # code page -- none, using plain ascii # _dbfTableHeader = to_bytes(_dbfTableHeader) _dbfTableHeaderExtra = b'' _supported_tables = () _pack_count = 0 backup = None class _Indexen(object): """ implements the weakref structure for seperate indexes """ def __init__(self): self._indexen = set() def __iter__(self): self._indexen = set([s for s in self._indexen if s() is not None]) return (s() for s in self._indexen if s() is not None) def __len__(self): self._indexen = set([s for s in self._indexen if s() is not None]) return len(self._indexen) def add(self, new_index): self._indexen.add(weakref.ref(new_index)) self._indexen = set([s for s in self._indexen if s() is not None]) class _MetaData(dict): """ Container class for storing per table metadata """ blankrecord = None codepage = None # code page being used (can be overridden when table is opened) dfd = None # file handle fields = None # field names field_count = 0 # number of fields field_types = None # dictionary of dbf type field specs filename = None # name of .dbf file ignorememos = False # True when memos should be ignored memoname = None # name of .dbt/.fpt file memo_size = None # size of blocks in memo file mfd = None # file handle memo = None # memo object memofields = None # field names of Memo type newmemofile = False # True when memo file needs to be created nulls = None # non-None when 
Nullable fields present user_fields = None # not counting SYSTEM fields user_field_count = 0 # also not counting SYSTEM fields unicode_errors = 'strict' # default to strict unicode translations status = CLOSED # until we open it class _TableHeader(object): """ represents the data block that defines a tables type and layout """ def __init__(self, data, pack_date, unpack_date): if len(data) != 32: raise BadDataError('table header should be 32 bytes, but is %d bytes' % len(data)) self.packDate = pack_date self.unpackDate = unpack_date self._data = array('B', data + CR) def codepage(self, cp=None): """ get/set code page of table """ if cp is None: return self._data[29] else: cp, sd, ld = _codepage_lookup(cp) self._data[29] = cp return cp @property def data(self): """ main data structure """ date = self.packDate(Date.today()) self._data[1:4] = array('B', date) return self._data # return to_bytes(self._data) @data.setter def data(self, bytes): if len(bytes) < 32: raise BadDataError("length for data of %d is less than 32" % len(bytes)) self._data[:] = array('B', bytes) @property def extra(self): "extra dbf info (located after headers, before data records)" fieldblock = self._data[32:] for i in range(len(fieldblock) // 32 + 1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") cr += 33 # skip past CR return self._data[cr:] # return to_bytes(self._data[cr:]) @extra.setter def extra(self, data): data = array('B', data) fieldblock = self._data[32:] for i in range(len(fieldblock) // 32 + 1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") cr += 33 # skip past CR self._data[cr:] = data self._data[8:10] = array('B', pack_short_int(len(self._data))) # start @property def field_count(self): "number of fields (read-only)" fieldblock = self._data[32:] for i in range(len(fieldblock) // 32 + 1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") 
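The `_TableHeader` accessors above pack and unpack fixed slices of the standard 32-byte dBase header. A stand-alone sketch of that layout using only `struct` (offsets match the slices used by the properties above; `parse_header` itself is a hypothetical helper, not part of this package):

```python
import struct

# Sketch of the 32-byte dBase table header that _TableHeader wraps:
#   0      version byte
#   1-3    last update as (year-1900, month, day)
#   4-7    record count, uint32 little-endian
#   8-9    start of first record (header length), uint16 LE
#   10-11  record length (including the delete flag), uint16 LE
#   29     code page byte
def parse_header(data):
    if len(data) != 32:
        raise ValueError('table header should be 32 bytes, but is %d' % len(data))
    yy, mm, dd = data[1], data[2], data[3]
    record_count, = struct.unpack('<L', data[4:8])
    start, = struct.unpack('<H', data[8:10])
    record_length, = struct.unpack('<H', data[10:12])
    return {
        'version': data[0],
        'last_update': (1900 + yy, mm, dd),
        'record_count': record_count,
        'start': start,
        'record_length': record_length,
        'codepage': data[29],
    }
```

The field descriptor block follows these 32 bytes as a run of 32-byte entries terminated by a CR (0x0D) byte, which is why `field_count` above is simply the byte offset of that CR divided by 32.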
return len(fieldblock[:cr]) // 32 @property def fields(self): """ field block structure """ fieldblock = self._data[32:] for i in range(len(fieldblock) // 32 + 1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") return fieldblock[:cr] # return to_bytes(fieldblock[:cr]) @fields.setter def fields(self, block): if isinstance(block, bytes): block = array('B', block) fieldblock = self._data[32:] for i in range(len(fieldblock) // 32 + 1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") cr += 32 # convert to indexing main structure fieldlen = len(block) if fieldlen % 32 != 0: raise BadDataError("fields structure corrupt: %d is not a multiple of 32" % fieldlen) self._data[32:cr] = array('B', block) # fields self._data[8:10] = array('B', pack_short_int(len(self._data))) # start fieldlen = fieldlen // 32 recordlen = 1 # deleted flag for i in range(fieldlen): recordlen += block[i*32+16] self._data[10:12] = array('B', pack_short_int(recordlen)) @property def record_count(self): """ number of records (maximum 16,777,215) """ return unpack_long_int(to_bytes(self._data[4:8])) @record_count.setter def record_count(self, count): self._data[4:8] = array('B', pack_long_int(count)) @property def record_length(self): """ length of a record (read_only) (max of 65,535) """ return unpack_short_int(to_bytes(self._data[10:12])) @record_length.setter def record_length(self, length): """ to support Clipper large Character fields """ self._data[10:12] = array('B', pack_short_int(length)) @property def start(self): """ starting position of first record in file (must be within first 64K) """ return unpack_short_int(to_bytes(self._data[8:10])) @start.setter def start(self, pos): self._data[8:10] = array('B', pack_short_int(pos)) @property def update(self): """ date of last table modification (read-only) """ return self.unpackDate(to_bytes(self._data[1:4])) @property def version(self): """ dbf version """ 
return self._data[0] @version.setter def version(self, ver): self._data[0] = ver class _Table(object): """ implements the weakref table for records """ def __init__(self, count, meta): self._meta = meta self._max_count = count self._weakref_list = {} self._accesses = 0 self._dead_check = 1024 def __getitem__(self, index): # maybe = self._weakref_list[index]() if index < 0: if self._max_count + index < 0: raise IndexError('index %d smaller than available records' % index) index = self._max_count + index if index >= self._max_count: raise IndexError('index %d greater than available records' % index) maybe = self._weakref_list.get(index) if maybe: maybe = maybe() self._accesses += 1 if self._accesses >= self._dead_check: for key, value in list(self._weakref_list.items()): if value() is None: del self._weakref_list[key] if not maybe: meta = self._meta if meta.status == CLOSED: raise DbfError("%s is closed; record %d is unavailable" % (meta.filename, index)) header = meta.header if index < 0: index += header.record_count size = header.record_length location = index * size + header.start meta.dfd.seek(location) if meta.dfd.tell() != location: raise ValueError("unable to seek to offset %d in file" % location) bytes = meta.dfd.read(size) if not bytes: raise ValueError("unable to read record data from %s at location %d" % (meta.filename, location)) maybe = Record(recnum=index, layout=meta, kamikaze=bytes, _fromdisk=True) self._weakref_list[index] = weakref.ref(maybe) return maybe def append(self, record): self._weakref_list[self._max_count] = weakref.ref(record) self._max_count += 1 def clear(self): for key in list(self._weakref_list.keys()): del self._weakref_list[key] self._max_count = 0 def flush(self): for maybe in self._weakref_list.values(): maybe = maybe() if maybe and not maybe._write_to_disk: raise DbfError("some records have not been written to disk") def pop(self): if not self._max_count: raise IndexError('no records exist') self._max_count -= 1 record = 
self._weakref_list[self._max_count] del self._weakref_list[self._max_count] return record def _build_header_fields(self): """ constructs fieldblock for disk table """ fieldblock = array('B', b'') memo = False nulls = False meta = self._meta header = meta.header if self._yesMemoMask <= 0x80: header.version = header.version & self._noMemoMask else: header.version = self._noMemoMask meta.fields = [f for f in meta.fields if f != '_NULLFLAGS'] for field in meta.fields: layout = meta[field] if meta.fields.count(field) > 1: raise BadDataError("corrupted field structure (noticed in _build_header_fields)") fielddef = array('B', [0] * 32) fielddef[:11] = array('B', pack_str(meta.encoder(field)[0])) fielddef[11] = layout[TYPE] fielddef[12:16] = array('B', pack_long_int(layout[START])) fielddef[16] = layout[LENGTH] fielddef[17] = layout[DECIMALS] fielddef[18] = layout[FLAGS] fieldblock.extend(fielddef) if layout[TYPE] in meta.memo_types: memo = True if layout[FLAGS] & NULLABLE: nulls += 1 if memo: if self._yesMemoMask <= 0x80: header.version = header.version | self._yesMemoMask else: header.version = self._yesMemoMask if meta.memo is None: meta.memo = self._memoClass(meta) else: if os.path.exists(meta.memoname): if meta.mfd is not None: meta.mfd.close() os.remove(meta.memoname) meta.memo = None if nulls: start = layout[START] + layout[LENGTH] length, one_more = divmod(nulls, 8) if one_more: length += 1 fielddef = array('B', [0] * 32) fielddef[:11] = array('B', pack_str(b'_NULLFLAGS')) fielddef[11] = 0x30 fielddef[12:16] = array('B', pack_long_int(start)) fielddef[16] = length fielddef[17] = 0 fielddef[18] = BINARY | SYSTEM fieldblock.extend(fielddef) meta.fields.append('_NULLFLAGS') nullflags = ( _NULLFLAG, # type start, # start length, # length start + length, # end 0, # decimals BINARY | SYSTEM, # flags none, # class none, # empty ) meta['_NULLFLAGS'] = nullflags # header.fields = to_bytes(fieldblock) header.fields = fieldblock meta.user_fields = FieldnameList([f for f in 
meta.fields if not meta[f][FLAGS] & SYSTEM]) meta.user_field_count = len(meta.user_fields) Record._create_blank_data(meta) def _check_memo_integrity(self): """ checks memo file for problems """ raise NotImplementedError("_check_memo_integrity must be implemented by subclass") def _initialize_fields(self): """ builds the FieldList of names, types, and descriptions from the disk file """ raise NotImplementedError("_initialize_fields must be implemented by subclass") def _field_layout(self, i): """ Returns field information Name Type(Length[, Decimals]) """ name = self._meta.fields[i] fielddef = self._meta[name] type = FieldType(fielddef[TYPE]) length = fielddef[LENGTH] decimals = fielddef[DECIMALS] set_flags = fielddef[FLAGS] flags = [] if type in (GENERAL, PICTURE): printable_flags = NULLABLE, SYSTEM else: printable_flags = BINARY, NULLABLE, SYSTEM for flg in printable_flags: if flg & set_flags == flg: flags.append(FieldFlag(flg)) set_flags &= 255 ^ flg if flags: flags = ' ' + ' '.join(f.text for f in flags) else: flags = '' if type in self._fixed_types: description = "%s %s%s" % (name, type.symbol, flags) elif type in self._numeric_types: description = "%s %s(%d,%d)%s" % (name, type.symbol, length, decimals, flags) else: description = "%s %s(%d)%s" % (name, type.symbol, length, flags) return description def _list_fields(self, specs, sep=','): """ standardizes field specs """ if specs is None: specs = self.field_names elif isinstance(specs, basestring): specs = specs.strip(sep).split(sep) else: specs = list(specs) specs = [s.strip() for s in specs] for i, s in enumerate(specs): if isinstance(s, bytes): specs[i] = s.decode(dbf.input_decoding) return FieldnameList(specs) def _nav_check(self): """ Raises `DbfError` if table is closed """ if self._meta.status == CLOSED: raise DbfError('table %s is closed' % self.filename) @staticmethod def _pack_date(date): """ Returns a group of three bytes, in integer form, of the date """ return array('B', [date.year - 1900, 
date.month, date.day]) @staticmethod def _unpack_date(bytestr): """ Returns a Date() of the packed three-byte date passed in """ year, month, day = struct.unpack('<BBB', bytestr) year += 1900 return Date(year, month, day) def __getitem__(self, value): if isinstance(value, baseinteger): index = int(value) if not -self._meta.header.record_count <= index < self._meta.header.record_count: raise NotFoundError("Record %d is not in table %s" % (index, self.filename)) return self._table[index] elif type(value) == slice: sequence = List(desc='%s --> %s' % (self.filename, value)) for index in range(len(self))[value]: record = self._table[index] sequence.append(record) return sequence else: raise TypeError('type <%s> not valid for indexing' % type(value)) def __init__(self, filename, field_specs=None, memo_size=128, ignore_memos=False, codepage=None, default_data_types=None, field_data_types=None, # e.g. 'name':str, 'age':float dbf_type=None, on_disk=True, unicode_errors='strict' ): """ open/create dbf file filename should include path if needed field_specs can be either a ;-delimited string or a list of strings memo_size is always 512 for db3 memos ignore_memos is useful if the memo file is missing or corrupt read_only will load records into memory, then close the disk file keep_memos will also load any memo fields into memory meta_only will ignore all records, keeping only basic table information codepage will override whatever is set in the table itself """ if not on_disk: if field_specs is None: raise DbfError("field list must be specified for memory tables") self._indexen = self._Indexen() self._meta = meta = self._MetaData() meta.max_fields = self._max_fields meta.max_records = self._max_records meta.table = weakref.ref(self) meta.filename = filename meta.fields = [] meta.user_fields = FieldnameList() meta.user_field_count = 0 meta.fieldtypes = fieldtypes = self._field_types meta.fixed_types = self._fixed_types meta.variable_types = self._variable_types meta.character_types = self._character_types meta.currency_types = self._currency_types meta.decimal_types = self._decimal_types meta.numeric_types = self._numeric_types meta.memo_types = self._memo_types meta.ignorememos = meta.original_ignorememos = ignore_memos meta.memo_size = memo_size meta.input_decoder = codecs.getdecoder(dbf.input_decoding) # from ascii to unicode meta.output_encoder =
codecs.getencoder(dbf.input_decoding) # and back to ascii meta.unicode_errors = unicode_errors meta.header = header = self._TableHeader(self._dbfTableHeader, self._pack_date, self._unpack_date) header.extra = self._dbfTableHeaderExtra if default_data_types is None: default_data_types = dict() elif default_data_types == 'enhanced': default_data_types = { 'C' : Char, 'L' : Logical, 'D' : Date, } if self._versionabbr in ('vfp', 'db4'): default_data_types['T'] = DateTime self._meta._default_data_types = default_data_types if field_data_types is None: field_data_types = dict() field_data_types = dict((k.upper(), v) for k, v in field_data_types.items()) self._meta._field_data_types = field_data_types for field, types in default_data_types.items(): field = FieldType(field) if not isinstance(types, tuple): types = (types, ) for result_name, result_type in ezip(('Class', 'Empty', 'Null'), types): fieldtypes[field][result_name] = result_type if not on_disk: self._table = [] meta.location = IN_MEMORY meta.memoname = filename meta.header.data else: base, ext = os.path.splitext(filename) search_name = None if ext == '.': # use filename without the '.' 
search_name = search_memo = base matches = glob(search_name) elif ext.lower() == '.dbf': # use filename as-is matches = glob(filename) search_memo = base else: meta.filename = filename + '.dbf' search_name = filename + '.[Db][Bb][Ff]' search_memo = filename matches = glob(search_name) if not matches: meta.filename = filename search_name = filename matches = glob(search_name) if len(matches) == 1: meta.filename = matches[0] elif matches: raise DbfError("please specify exactly which of %r you want" % (matches, )) case = [('l','u')[c.isupper()] for c in meta.filename[-4:]] meta.memoname = base + ''.join([c if case[i] == 'l' else c.upper() for i, c in enumerate(self._memoext)]) if not os.path.exists(meta.memoname): # look for other case variations template = ''.join('[%s%s]' % (c, c.upper()) for c in self._memoext[1:]) matches = glob('%s.%s' % (search_memo, template)) if len(matches) == 1: meta.memoname = matches[0] elif len(matches) > 1: raise DbfError("too many possible memo files: %s" % ', '.join(matches)) meta.location = ON_DISK if codepage is not None: header.codepage(codepage) cp, sd, ld = _codepage_lookup(codepage) self._meta.codepage = sd self._meta.decoder, self._meta.encoder = unicode_error_handler(codecs.getdecoder(sd), codecs.getencoder(sd), unicode_errors) if field_specs: meta.status = READ_WRITE if meta.location == ON_DISK: meta.dfd = open(meta.filename, 'w+b') meta.newmemofile = True if codepage is None: header.codepage(default_codepage) cp, sd, ld = _codepage_lookup(header.codepage()) self._meta.codepage = sd self._meta.decoder, self._meta.encoder = unicode_error_handler(codecs.getdecoder(sd), codecs.getencoder(sd), unicode_errors) self.add_fields(field_specs) else: meta.status = READ_ONLY try: dfd = meta.dfd = open(meta.filename, 'rb') except IOError: e = sys.exc_info()[1] raise DbfError(unicode(e)).from_exc(None) dfd.seek(0) try: meta.header = header = self._TableHeader(dfd.read(32), self._pack_date, self._unpack_date) if not header.version in 
self._supported_tables: dfd.close() raise DbfError( "%s does not support %s [%x]" % (self._version, version_map.get(header.version, 'Unknown: %s' % header.version), header.version)) if codepage is None: cp, sd, ld = _codepage_lookup(header.codepage()) self._meta.codepage = sd self._meta.decoder, self._meta.encoder = unicode_error_handler(codecs.getdecoder(sd), codecs.getencoder(sd), unicode_errors) fieldblock = array('B', dfd.read(header.start - 32)) for i in range(len(fieldblock) // 32 + 1): fieldend = i * 32 if fieldblock[fieldend] == CR: break else: raise BadDataError("corrupt field structure in header") if len(fieldblock[:fieldend]) % 32 != 0: raise BadDataError("corrupt field structure in header") old_length = header.data[10:12] header.fields = fieldblock[:fieldend] header.data = header.data[:10] + old_length + header.data[12:] # restore original for testing header.extra = fieldblock[fieldend + 1:] # skip trailing \r self._initialize_fields() self._check_memo_integrity() dfd.seek(0) except DbfError: dfd.close() raise for field in meta.fields: field_type = meta[field][TYPE] default_field_type = ( fieldtypes[field_type]['Class'], fieldtypes[field_type]['Empty'], ) specific_field_type = field_data_types.get(field) if specific_field_type is not None and not isinstance(specific_field_type, tuple): specific_field_type = (specific_field_type, ) classes = [] for result_name, result_type in ezip( ('class', 'empty'), specific_field_type or default_field_type, ): classes.append(result_type) meta[field] = meta[field][:Field.CLASS] + tuple(classes) + meta[field][Field.NUL:] self.close() def __iter__(self): """ iterates over the table's records """ return Iter(self) def __len__(self): """ returns number of records in table """ return self._meta.header.record_count def __new__(cls, filename, field_specs=None, memo_size=128, ignore_memos=False, codepage=None, default_data_types=None, field_data_types=None, # e.g. 
'name':str, 'age':float dbf_type=None, on_disk=True, unicode_errors='strict', ): if dbf_type is None and isinstance(filename, Table): return filename if field_specs and dbf_type is None: dbf_type = dbf.default_type if dbf_type is not None: dbf_type = dbf_type.lower() table = table_types.get(dbf_type) if table is None: raise DbfError("Unknown table type: %s" % dbf_type) return object.__new__(table) else: possibles = guess_table_type(filename) if len(possibles) == 1: return object.__new__(possibles[0][2]) else: for type, desc, cls in possibles: if type == dbf.default_type: return object.__new__(cls) else: types = ', '.join(["%s" % item[1] for item in possibles]) abbrs = '[' + ' | '.join(["%s" % item[0] for item in possibles]) + ']' raise DbfError("Table could be any of %s. Please specify %s when opening" % (types, abbrs)) if py_ver < (3, 0): def __nonzero__(self): """ True if table has any records """ return self._meta.header.record_count != 0 else: def __bool__(self): """ True if table has any records """ return self._meta.header.record_count != 0 def __repr__(self): return __name__ + ".Table(%r, status=%r)" % (self._meta.filename, self._meta.status) def __str__(self): encoder = self._meta.encoder status = self._meta.status version = version_map.get(self._meta.header.version) if version is not None: version = self._version else: version = 'unknown - ' + hex(self._meta.header.version) str = """ Table: %s Type: %s Listed Codepage: %s Used Codepage: %s Status: %s Last updated: %s Record count: %d Field count: %d Record length: %d """ % (self.filename, version, code_pages[self._meta.header.codepage()][1], self.codepage, status, self.last_update, len(self), self.field_count, self.record_length) str += "\n --Fields--\n" for i in range(len(self.field_names)): str += "%11d) %s\n" % (i, (self._field_layout(i).encode(dbf.input_decoding, errors='backslashreplace'))) return str @property def codepage(self): """ code page used for text translation """ return 
CodePage(self._meta.codepage) @codepage.setter def codepage(self, codepage): if not isinstance(codepage, CodePage): raise TypeError("codepage should be a CodePage, not a %r" % type(codepage)) meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to change codepage' % meta.filename) cp, sd, ld = _codepage_lookup(meta.header.codepage(codepage.code)) meta.decoder, meta.encoder = unicode_error_handler(codecs.getdecoder(sd), codecs.getencoder(sd), meta.unicode_errors) meta.codepage = sd self._update_disk(headeronly=True) @property def field_count(self): """ the number of user fields in the table """ return self._meta.user_field_count @property def field_names(self): """ a list of the user fields in the table """ return self._meta.user_fields[:] @property def filename(self): """ table's file name, including path (if specified on open) """ return self._meta.filename @property def last_update(self): """ date of last update """ return self._meta.header.update @property def memoname(self): """ table's memo name (if path included in filename on open) """ return self._meta.memoname @property def record_length(self): """ number of bytes in a record (including deleted flag and null field size """ return self._meta.header.record_length @property def supported_tables(self): """ allowable table types """ return self._supported_tables @property def status(self): """ CLOSED, READ_ONLY, or READ_WRITE """ return self._meta.status @property def version(self): """ returns the dbf type of the table """ return self._version def add_fields(self, field_specs): """ adds field(s) to the table layout; format is Name Type(Length,Decimals)[; Name Type(Length,Decimals)[...]] backup table is created with _backup appended to name then zaps table, recreates current structure, and copies records back from the backup """ # for python 2, convert field_specs from bytes to unicode if necessary if py_ver < (3, 0): if isinstance(field_specs, bytes): if 
dbf.input_decoding is None: raise DbfError('field specifications must be unicode, not bytes (or set dbf.input_decoding)') field_specs = field_specs.decode(dbf.input_decoding) if isinstance(field_specs, list) and any(isinstance(t, bytes) for t in field_specs): if dbf.input_decoding is None: raise DbfError('field specifications must be unicode, not bytes (or set dbf.input_decoding)') fs = [] for text in field_specs: if isinstance(text, bytes): text = text.decode(dbf.input_decoding) fs.append(text) field_specs = fs else: if ( isinstance(field_specs, bytes) or isinstance(field_specs, list) and any(isinstance(t, bytes) for t in field_specs) ): raise DbfError('field specifications must be unicode, not bytes') meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to add fields (%s)' % (meta.filename, meta.status)) fields = self.structure() original_fields = len(fields) fields += self._list_fields(field_specs, sep=u';') null_fields = any(['NULL' in f.upper() for f in fields]) if (len(fields) + null_fields) > meta.max_fields: raise DbfError( "Adding %d more field%s would exceed the limit of %d" % (len(fields), ('','s')[len(fields)==1], meta.max_fields) ) old_table = None if self: old_table = self.create_backup() self.zap() if meta.mfd is not None and not meta.ignorememos: meta.mfd.close() meta.mfd = None meta.memo = None if not meta.ignorememos: meta.newmemofile = True offset = 1 for name in meta.fields: del meta[name] meta.fields[:] = [] meta.blankrecord = None null_index = -1 for field_seq, field in enumerate(fields): if not field: continue field = field.upper() pieces = field.split() name = pieces.pop(0) try: if '(' in pieces[0]: loc = pieces[0].index('(') pieces.insert(0, pieces[0][:loc]) pieces[1] = pieces[1][loc:] format = FieldType(pieces.pop(0)) if pieces and '(' in pieces[0]: for i, p in enumerate(pieces): if ')' in p: pieces[0:i+1] = [''.join(pieces[0:i+1])] break except IndexError: raise FieldSpecError('bad field spec: 
%r' % field) if field_seq >= original_fields and (name[0] == '_' or name[0].isdigit() or not name.replace('_', '').isalnum()): # find appropriate line to point warning to for i, frame in enumerate(reversed(traceback.extract_stack()), start=1): if frame[0] == __file__ and frame[2] == 'resize_field': # ignore break elif frame[0] != __file__ or frame[2] not in ('__init__','add_fields'): warnings.warn('%r invalid: field names should start with a letter, and only contain letters, digits, and _' % name, FieldNameWarning, stacklevel=i) break if name in meta.fields: raise DbfError("Field '%s' already exists" % name) field_type = format if len(name) > 10: raise FieldSpecError("Maximum field name length is 10. '%s' is %d characters long." % (name, len(name))) if not field_type in meta.fieldtypes.keys(): raise FieldSpecError("Unknown field type: %s" % field_type) init = self._meta.fieldtypes[field_type]['Init'] flags = self._meta.fieldtypes[field_type]['flags'] try: length, decimals, flags = init(pieces, flags) except FieldSpecError: exc = sys.exc_info()[1] raise FieldSpecError(exc.message + ' (%s:%s)' % (meta.filename, name)).from_exc(None) nullable = flags & NULLABLE if nullable: null_index += 1 start = offset end = offset + length offset = end meta.fields.append(name) cls = meta.fieldtypes[field_type]['Class'] empty = meta.fieldtypes[field_type]['Empty'] meta[name] = ( field_type, start, length, end, decimals, flags, cls, empty, nullable and null_index, ) self._build_header_fields() self._update_disk() if old_table is not None: old_table.open() for record in old_table: self.append(scatter(record)) old_table.close() def allow_nulls(self, fields): """ set fields to allow null values -- NO LONGER ALLOWED, MUST BE SET AT TABLE CREATION """ raise DbfError('fields can only be set to allow NULLs at table creation') def append(self, data=b'', drop=False, multiple=1): """ adds blank records, and fills fields with dict/tuple values if present """ meta = self._meta if meta.status != 
READ_WRITE: raise DbfError('%s not in read/write mode, unable to append records' % meta.filename) if not self.field_count: raise DbfError("No fields defined, cannot append") dictdata = False tupledata = False header = meta.header kamikaze = b'' if header.record_count == meta.max_records: raise DbfError("table %r is full; unable to add any more records" % self) if isinstance(data, (Record, RecordTemplate)): if data._meta.record_sig[0] == self._meta.record_sig[0]: kamikaze = data._data else: if isinstance(data, dict): dictdata = dict((ensure_unicode(k).upper(), v) for k, v in data.items()) data = b'' elif isinstance(data, tuple): if len(data) > self.field_count: raise DbfError("incoming data has too many values") tupledata = data data = b'' elif data: raise TypeError("data to append must be a tuple, dict, record, or template; not a %r" % type(data)) newrecord = Record(recnum=header.record_count, layout=meta, kamikaze=kamikaze) if kamikaze and meta.memofields: newrecord._start_flux() for field in meta.memofields: newrecord[field] = data[field] newrecord._commit_flux() self._table.append(newrecord) header.record_count += 1 if not kamikaze: try: if dictdata: gather(newrecord, dictdata, drop=drop) elif tupledata: newrecord._start_flux() for index, item in enumerate(tupledata): item = ensure_unicode(item) newrecord[index] = item newrecord._commit_flux() elif data: newrecord._start_flux() data_fields = field_names(data) my_fields = self.field_names for field in data_fields: if field not in my_fields: if not drop: raise DbfError("field %r not in table %r" % (field, self)) else: newrecord[field] = data[field] newrecord._commit_flux() except Exception: self._table.pop() # discard failed record header.record_count = header.record_count - 1 self._update_disk() raise multiple -= 1 if multiple: data = newrecord._data single = header.record_count total = single + multiple while single < total: multi_record = Record(single, meta, kamikaze=data) multi_record._start_flux() 
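`Table.append` above adds the blank record first, fills it in, and on any failure pops it back off and restores the record count so the table is left unchanged. The same rollback pattern, reduced to a stand-alone sketch (the dict-based container and helper name are illustrative only):

```python
# Sketch of the append-then-rollback pattern used by Table.append:
# append a blank record, fill it from the incoming values, and discard
# it again if anything goes wrong while filling.
def append_record(records, values, field_names):
    record = dict.fromkeys(field_names)   # the "blank" record
    records.append(record)
    try:
        for name, value in values.items():
            if name not in field_names:
                raise KeyError('field %r not in table' % name)
            record[name] = value
    except Exception:
        records.pop()                     # discard the failed record
        raise
    return record
```

The real method has more moving parts (memo fields, the `multiple=` fast path, disk flushes), but the invariant is the same: a failed append never leaves a half-written record behind.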
self._table.append(multi_record) for field in meta.memofields: multi_record[field] = newrecord[field] single += 1 multi_record._commit_flux() header.record_count = total # += multiple newrecord = multi_record self._update_disk(headeronly=True) def close(self): """ closes disk files, flushing record data to disk ensures table data is available if keep_table ensures memo data is available if keep_memos """ if self._meta.location == ON_DISK and self._meta.status != CLOSED: self._table.flush() if self._meta.mfd is not None: self._meta.mfd.close() self._meta.mfd = None if self._meta.dfd is not None: self._meta.dfd.close() self._meta.dfd = None self._meta.status = CLOSED def create_backup(self, new_name=None, on_disk=None): """ creates a backup table """ meta = self._meta already_open = meta.status != CLOSED if not already_open: self.open() if on_disk is None: on_disk = meta.location if not on_disk and new_name is None: new_name = self.filename + '_backup' if new_name is None: upper = self.filename.isupper() directory, filename = os.path.split(self.filename) name, ext = os.path.splitext(filename) extra = ('_backup', '_BACKUP')[upper] new_name = os.path.join(temp_dir or directory, name + extra + ext) memo_size = meta.memo_size bkup = Table( new_name, self.structure(), memo_size, codepage=self.codepage.name, dbf_type=self._versionabbr, on_disk=on_disk, ) # use same encoder/decoder as current table, which may have been overridden bkup._meta.encoder = self._meta.encoder bkup._meta.decoder = self._meta.decoder bkup.open(READ_WRITE) for record in self: bkup.append(record) bkup.close() self.backup = new_name if not already_open: self.close() return bkup def create_index(self, key): """ creates an in-memory index using the function key """ meta = self._meta if meta.status == CLOSED: raise DbfError('%s is closed' % meta.filename) return Index(self, key) def create_template(self, record=None, defaults=None): """ returns a record template that can be used like a record """ return 
RecordTemplate(self._meta, original_record=record, defaults=defaults) def delete_fields(self, doomed): """ removes field(s) from the table creates backup files with _backup appended to the file name, then modifies current structure """ meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to delete fields' % meta.filename) doomed = self._list_fields(doomed) for victim in doomed: if victim not in meta.user_fields: raise DbfError("field %s not in table -- delete aborted" % victim) old_table = None if self: old_table = self.create_backup() self.zap() if meta.mfd is not None and not meta.ignorememos: meta.mfd.close() meta.mfd = None meta.memo = None if not meta.ignorememos: meta.newmemofile = True if '_NULLFLAGS' in meta.fields: doomed.append('_NULLFLAGS') for victim in doomed: layout = meta[victim] meta.fields.pop(meta.fields.index(victim)) start = layout[START] end = layout[END] for field in meta.fields: if meta[field][START] == end: specs = list(meta[field]) end = specs[END] #self._meta[field][END] specs[START] = start #self._meta[field][START] = start specs[END] = start + specs[LENGTH] #self._meta[field][END] = start + self._meta[field][LENGTH] start = specs[END] #self._meta[field][END] meta[field] = tuple(specs) self._build_header_fields() self._update_disk() for name in list(meta): if name not in meta.fields: del meta[name] if old_table is not None: old_table.open() for record in old_table: self.append(scatter(record), drop=True) old_table.close() def disallow_nulls(self, fields): """ set fields to not allow null values """ meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to change field types' % meta.filename) fields = self._list_fields(fields) missing = set(fields) - set(self.field_names) if missing: raise FieldMissingError(', '.join(missing)) old_table = None if self: old_table = self.create_backup() self.zap() if meta.mfd is not None and not meta.ignorememos: 
meta.mfd.close() meta.mfd = None meta.memo = None if not meta.ignorememos: meta.newmemofile = True for field in fields: specs = list(meta[field]) specs[FLAGS] &= 0xff ^ NULLABLE meta[field] = tuple(specs) meta.blankrecord = None self._build_header_fields() self._update_disk() if old_table is not None: old_table.open() for record in old_table: self.append(scatter(record)) old_table.close() def field_info(self, field): """ returns (field type, size, dec, class) of field """ field = field.upper() if field in self.field_names: field = self._meta[field] return FieldInfo(field[TYPE], field[LENGTH], field[DECIMALS], field[CLASS]) raise FieldMissingError("%s is not a field in %s" % (field, self.filename)) def index(self, record, start=None, stop=None): """ returns the index of record between start and stop start and stop default to the first and last record """ if not isinstance(record, (Record, RecordTemplate, dict, tuple)): raise TypeError("x should be a record, template, dict, or tuple, not %r" % type(record)) meta = self._meta if meta.status == CLOSED: raise DbfError('%s is closed' % meta.filename) if start is None: start = 0 if stop is None: stop = len(self) for i in range(start, stop): if record == (self[i]): return i else: raise NotFoundError("dbf.Table.index(x): x not in table", data=record) def new(self, filename, field_specs=None, memo_size=None, ignore_memos=None, codepage=None, default_data_types=None, field_data_types=None, on_disk=True): """ returns a new table of the same type """ if field_specs is None: field_specs = self.structure() if on_disk: path, name = os.path.split(filename) if path == "": filename = os.path.join(os.path.split(self.filename)[0], filename) elif name == "": filename = os.path.join(path, os.path.split(self.filename)[1]) if memo_size is None: memo_size = self._meta.memo_size if ignore_memos is None: ignore_memos = self._meta.ignorememos if codepage is None: codepage = self._meta.header.codepage() if default_data_types is None: 
default_data_types = self._meta._default_data_types if field_data_types is None: field_data_types = self._meta._field_data_types return Table(filename, field_specs, memo_size, ignore_memos, codepage, default_data_types, field_data_types, dbf_type=self._versionabbr, on_disk=on_disk) def nullable_field(self, field): """ returns True if field allows Nulls """ field = field.upper() if field not in self.field_names: raise FieldMissingError(field) return bool(self._meta[field][FLAGS] & NULLABLE) def open(self, mode=READ_ONLY): """ (re)opens disk table, (re)initializes data structures """ if mode not in (READ_WRITE, READ_ONLY): raise DbfError("mode for open must be dbf.READ_ONLY or dbf.READ_WRITE, not %r" % mode) meta = self._meta if meta.status == mode: return self # no-op meta.status = mode if meta.location == IN_MEMORY: return self if '_table' in dir(self): del self._table mode = ('rb', 'r+b')[meta.status is READ_WRITE] dfd = meta.dfd = open(meta.filename, mode) dfd.seek(0) header = meta.header = self._TableHeader(dfd.read(32), self._pack_date, self._unpack_date) if not header.version in self._supported_tables: dfd.close() dfd = None raise DbfError("Unsupported dbf type: %s [%x]" % (version_map.get(header.version, 'Unknown: %s' % header.version), header.version)) fieldblock = array('B', dfd.read(header.start - 32)) for i in range(len(fieldblock) // 32 + 1): fieldend = i * 32 if fieldblock[fieldend] == CR: break else: raise BadDataError("corrupt field structure in header") if len(fieldblock[:fieldend]) % 32 != 0: raise BadDataError("corrupt field structure in header") header.fields = fieldblock[:fieldend] header.extra = fieldblock[fieldend + 1:] # skip trailing \r self._meta.ignorememos = self._meta.original_ignorememos self._initialize_fields() self._check_memo_integrity() self._index = -1 dfd.seek(0) return self def pack(self): """ physically removes all deleted records """ meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, 
unable to pack records' % meta.filename) for dbfindex in self._indexen: dbfindex._clear() newtable = [] index = 0 for record in self._table: if is_deleted(record): record._recnum = -1 else: record._recnum = index newtable.append(record) index += 1 if meta.location == ON_DISK: self._table.clear() else: self._table[:] = [] for record in newtable: self._table.append(record) self._pack_count += 1 self._meta.header.record_count = index self._index = -1 self._update_disk() self.reindex() def query(self, criteria): """ criteria is a string that will be converted into a function that returns a List of all matching records """ meta = self._meta if meta.status == CLOSED: raise DbfError('%s is closed' % meta.filename) return pqlc(self, criteria) def reindex(self): """ reprocess all indices for this table """ meta = self._meta if meta.status == CLOSED: raise DbfError('%s is closed' % meta.filename) for dbfindex in self._indexen: dbfindex._reindex() def rename_field(self, oldname, newname): """ renames an existing field """ oldname = oldname.upper() newname = newname.upper() meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to change field names' % meta.filename) if self: self.create_backup() if not oldname in self._meta.user_fields: raise FieldMissingError("field --%s-- does not exist -- cannot rename it." % oldname) if newname[0] == '_' or newname[0].isdigit() or not newname.replace('_', '').isalnum(): raise FieldSpecError("field names cannot start with _ or digits, and can only contain the _, letters, and digits") if newname in self._meta.fields: raise DbfError("field --%s-- already exists" % newname) if len(newname) > 10: raise FieldSpecError("maximum field name length is 10. '%s' is %d characters long." 
                    % (newname, len(newname)))
        self._meta[newname] = self._meta[oldname]
        self._meta.fields[self._meta.fields.index(oldname)] = newname
        self._build_header_fields()
        self._update_disk(headeronly=True)

    def resize_field(self, chosen, new_size):
        """
        resizes field (C only at this time)
        creates backup file, then modifies current structure
        """
        meta = self._meta
        if meta.status != READ_WRITE:
            raise DbfError('%s not in read/write mode, unable to change field size' % meta.filename)
        if self._versionabbr == 'clp':
            max_size = 65535
        else:
            max_size = 255
        if not 0 < new_size <= max_size:
            raise DbfError("new_size must be between 1 and %d (use delete_fields to remove a field)" % max_size)
        chosen = self._list_fields(chosen)
        for candidate in chosen:
            if candidate not in self._meta.user_fields:
                raise DbfError("field %s not in table -- resize aborted" % candidate)
            elif self.field_info(candidate).field_type != FieldType.CHAR:
                raise DbfError("field %s is not Character -- resize aborted" % candidate)
        old_table = None
        if self:
            old_table = self.create_backup()
            self.zap()
        if meta.mfd is not None and not meta.ignorememos:
            meta.mfd.close()
            meta.mfd = None
            meta.memo = None
        if not meta.ignorememos:
            meta.newmemofile = True
        struct = self.structure()
        meta.user_fields[:] = []
        new_struct = []
        for field_spec in struct:
            name, spec = field_spec.split(' ', 1)
            if name in chosen:
                spec = "C(%d)" % new_size
            new_struct.append(' '.join([name, spec]))
        self.add_fields(';'.join(new_struct))
        if old_table is not None:
            old_table.open()
            for record in old_table:
                self.append(scatter(record), drop=True)
            old_table.close()

    def structure(self, fields=None):
        """
        return field specification list suitable for creating same table layout
        fields should be a list of fields or None for all fields in table
        """
        field_specs = FieldnameList([])
        fields = self._list_fields(fields)
        try:
            for name in fields:
                field_specs.append(self._field_layout(self.field_names.index(name)))
        except ValueError:
            raise DbfError("field %s does not exist" % name).from_exc(None)
return field_specs def zap(self): """ removes all records from table -- this cannot be undone! """ meta = self._meta if meta.status != READ_WRITE: raise DbfError('%s not in read/write mode, unable to zap table' % meta.filename) if meta.location == IN_MEMORY: self._table[:] = [] else: self._table.clear() if meta.memo: meta.memo._zap() meta.header.record_count = 0 self._index = -1 self._update_disk() class Db3Table(Table): """ Provides an interface for working with dBase III tables. """ _version = 'dBase III Plus' _versionabbr = 'db3' @MutableDefault def _field_types(): return { CHAR: { 'Type':'Character', 'Retrieve':retrieve_character, 'Update':update_character, 'Blank':lambda x: b' ' * x, 'Init':add_character, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, DATE: { 'Type':'Date', 'Retrieve':retrieve_date, 'Update':update_date, 'Blank':lambda x: b' ', 'Init':add_date, 'Class':datetime.date, 'Empty':none, 'flags':tuple(), }, NUMERIC: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, LOGICAL: { 'Type':'Logical', 'Retrieve':retrieve_logical, 'Update':update_logical, 'Blank':lambda x: b'?', 'Init':add_logical, 'Class':bool, 'Empty':none, 'flags':tuple(), }, MEMO: { 'Type':'Memo', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ', 'Init':add_memo, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, FLOAT: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, TIMESTAMP: { 'Type':'TimeStamp', 'Retrieve':retrieve_clp_timestamp, 'Update':update_clp_timestamp, 'Blank':lambda x: b'\x00' * 8, 'Init':add_clp_timestamp, 'Class':datetime.datetime, 'Empty':none, 'flags':tuple(), }, } _memoext = '.dbt' _memoClass = _Db3Memo _yesMemoMask = 0x80 _noMemoMask = 0x7f _binary_types = (TIMESTAMP, ) _character_types = 
(CHAR, MEMO) _currency_types = tuple() _date_types = (DATE, ) _datetime_types = (TIMESTAMP, ) _decimal_types = (NUMERIC, FLOAT) _fixed_types = (DATE, LOGICAL, MEMO, TIMESTAMP) _logical_types = (LOGICAL, ) _memo_types = (MEMO, ) _numeric_types = (NUMERIC, FLOAT) _variable_types = (CHAR, NUMERIC, FLOAT) _dbfTableHeader = array('B', [0] * 32) _dbfTableHeader[0] = 3 # version - dBase III w/o memo's _dbfTableHeader[8:10] = array('B', pack_short_int(33)) _dbfTableHeader[10] = 1 # record length -- one for delete flag _dbfTableHeader[29] = 3 # code page -- 437 US-MS DOS # _dbfTableHeader = to_bytes(_dbfTableHeader) _dbfTableHeaderExtra = b'' _supported_tables = (0x03, 0x83) def _check_memo_integrity(self): """ dBase III and Clipper """ if not self._meta.ignorememos: memo_fields = False for field in self._meta.fields: if self._meta[field][TYPE] in self._memo_types: memo_fields = True break if memo_fields and self._meta.header.version != 0x83: self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: memo fields exist, header declares no memos") elif memo_fields and not os.path.exists(self._meta.memoname): self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: memo fields exist without memo file") if memo_fields: try: self._meta.memo = self._memoClass(self._meta) except Exception: exc = sys.exc_info()[1] self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: unable to use memo file (%s)" % exc.args[-1]).from_exc(None) def _initialize_fields(self): """ builds the FieldList of names, types, and descriptions """ old_fields = defaultdict(dict) meta = self._meta for name in meta.fields: old_fields[name]['type'] = meta[name][TYPE] old_fields[name]['empty'] = meta[name][EMPTY] old_fields[name]['class'] = meta[name][CLASS] meta.fields[:] = [] offset = 1 fieldsdef = meta.header.fields if len(fieldsdef) % 32 != 0: raise BadDataError("field definition block corrupt: %d bytes in size" % 
len(fieldsdef)) if len(fieldsdef) // 32 != meta.header.field_count: raise BadDataError("Header shows %d fields, but field definition block has %d fields" % (meta.header.field_count, len(fieldsdef) // 32)) total_length = meta.header.record_length # null fields not allowed in db3 tables null_index = None for i in range(meta.header.field_count): fieldblock = fieldsdef[i*32:(i+1)*32] name = self._meta.decoder(unpack_str(fieldblock[:11]))[0].upper() type = fieldblock[11] if not type in meta.fieldtypes: raise BadDataError("Unknown field type: %s" % type) start = offset length = fieldblock[16] offset += length end = start + length decimals = fieldblock[17] flags = fieldblock[18] if name in meta.fields: raise BadDataError('Duplicate field name found: %s' % name) meta.fields.append(name) if name in old_fields and old_fields[name]['type'] == type: cls = old_fields[name]['class'] empty = old_fields[name]['empty'] else: cls = meta.fieldtypes[type]['Class'] empty = meta.fieldtypes[type]['Empty'] meta[name] = ( type, start, length, end, decimals, flags, cls, empty, null_index ) if offset != total_length: raise BadDataError("Header shows record length of %d, but calculated record length is %d" % (total_length, offset)) meta.user_fields = FieldnameList([f for f in meta.fields if not meta[f][FLAGS] & SYSTEM]) meta.user_field_count = len(meta.user_fields) Record._create_blank_data(meta) class ClpTable(Db3Table): """ Provides an interface for working with Clipper tables. 
""" _version = 'Clipper 5' _versionabbr = 'clp' @MutableDefault def _field_types(): return { CHAR: { 'Type':'Character', 'Retrieve':retrieve_character, 'Update':update_character, 'Blank':lambda x: b' ' * x, 'Init':add_clp_character, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, DATE: { 'Type':'Date', 'Retrieve':retrieve_date, 'Update':update_date, 'Blank':lambda x: b' ', 'Init':add_date, 'Class':datetime.date, 'Empty':none, 'flags':tuple(), }, NUMERIC: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, LOGICAL: { 'Type':'Logical', 'Retrieve':retrieve_logical, 'Update':update_logical, 'Blank':lambda x: b'?', 'Init':add_logical, 'Class':bool, 'Empty':none, 'flags':tuple(), }, MEMO: { 'Type':'Memo', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ', 'Init':add_memo, 'Class':unicode, 'Empty':unicode, 'flags':tuple(), }, FLOAT: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_numeric, 'Class':'default', 'Empty':none, 'flags':tuple(), }, TIMESTAMP: { 'Type':'TimeStamp', 'Retrieve':retrieve_clp_timestamp, 'Update':update_clp_timestamp, 'Blank':lambda x: b'\x00' * 8, 'Init':add_clp_timestamp, 'Class':datetime.datetime, 'Empty':none, 'flags':tuple(), }, } _memoext = '.dbt' _memoClass = _Db3Memo _yesMemoMask = 0x80 _noMemoMask = 0x7f _binary_types = () _character_types = (CHAR, MEMO) _currency_types = tuple() _date_types = (DATE, TIMESTAMP) _datetime_types = (TIMESTAMP, ) _decimal_types = (NUMERIC, FLOAT) _fixed_types = (DATE, LOGICAL, MEMO, TIMESTAMP) _logical_types = (LOGICAL, ) _memo_types = (MEMO, ) _numeric_types = (NUMERIC, FLOAT) _variable_types = (CHAR, NUMERIC, FLOAT) _dbfTableHeader = array('B', [0] * 32) _dbfTableHeader[0] = 3 # version - dBase III w/o memo's _dbfTableHeader[8:10] = array('B', pack_short_int(33)) _dbfTableHeader[10] = 1 # record 
length -- one for delete flag _dbfTableHeader[29] = 3 # code page -- 437 US-MS DOS # _dbfTableHeader = to_bytes(_dbfTableHeader) _dbfTableHeaderExtra = b'' _supported_tables = (0x03, 0x83) class _TableHeader(Table._TableHeader): """ represents the data block that defines a tables type and layout """ @property def fields(self): "field block structure" fieldblock = self._data[32:] for i in range(len(fieldblock)//32+1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") return fieldblock[:cr] # return to_bytes(fieldblock[:cr]) @fields.setter def fields(self, block): fieldblock = self._data[32:] for i in range(len(fieldblock)//32+1): cr = i * 32 if fieldblock[cr] == CR: break else: raise BadDataError("corrupt field structure") cr += 32 # convert to indexing main structure fieldlen = len(block) if fieldlen % 32 != 0: raise BadDataError("fields structure corrupt: %d is not a multiple of 32" % fieldlen) self._data[32:cr] = array('B', block) # fields self._data[8:10] = array('B', pack_short_int(len(self._data))) # start fieldlen = fieldlen // 32 recordlen = 1 # deleted flag for i in range(fieldlen): recordlen += block[i*32+16] if block[i*32+11] == CHAR: recordlen += block[i*32+17] * 256 self._data[10:12] = array('B', pack_short_int(recordlen)) def _build_header_fields(self): """ constructs fieldblock for disk table """ fieldblock = array('B', b'') memo = False nulls = 0 meta = self._meta header = meta.header header.version = header.version & self._noMemoMask meta.fields = [f for f in meta.fields if f != '_NULLFLAGS'] total_length = 1 # delete flag for field in meta.fields: layout = meta[field] if meta.fields.count(field) > 1: raise BadDataError("corrupted field structure (noticed in _build_header_fields)") fielddef = array('B', [0] * 32) fielddef[:11] = array('B', pack_str(meta.encoder(field)[0])) fielddef[11] = layout[TYPE] fielddef[12:16] = array('B', pack_long_int(layout[START])) total_length += layout[LENGTH] if layout[TYPE] 
== CHAR: # long character field fielddef[16] = layout[LENGTH] % 256 fielddef[17] = layout[LENGTH] // 256 else: fielddef[16] = layout[LENGTH] fielddef[17] = layout[DECIMALS] fielddef[18] = layout[FLAGS] fieldblock.extend(fielddef) if layout[TYPE] in meta.memo_types: memo = True if layout[FLAGS] & NULLABLE: nulls += 1 if memo: header.version = header.version | self._yesMemoMask if meta.memo is None: meta.memo = self._memoClass(meta) else: if os.path.exists(meta.memoname): if meta.mfd is not None: meta.mfd.close() os.remove(meta.memoname) meta.memo = None if nulls: start = layout[START] + layout[LENGTH] length, one_more = divmod(nulls, 8) if one_more: length += 1 fielddef = array('B', [0] * 32) fielddef[:11] = array('B', pack_str(b'_NullFlags')) fielddef[11] = FieldType._NULLFLAG fielddef[12:16] = array('B', pack_long_int(start)) fielddef[16] = length fielddef[17] = 0 fielddef[18] = BINARY | SYSTEM fieldblock.extend(fielddef) meta.fields.append('_NULLFLAGS') nullflags = ( _NULLFLAG, # type start, # start length, # length start + length, # end 0, # decimals BINARY | SYSTEM, # flags none, # class none, # empty ) meta['_NULLFLAGS'] = nullflags header.fields = fieldblock # header.fields = to_bytes(fieldblock) header.record_length = total_length meta.user_fields = FieldnameList([f for f in meta.fields if not meta[f][FLAGS] & SYSTEM]) meta.user_field_count = len(meta.user_fields) Record._create_blank_data(meta) def _initialize_fields(self): """ builds the FieldList of names, types, and descriptions """ meta = self._meta old_fields = defaultdict(dict) for name in meta.fields: old_fields[name]['type'] = meta[name][TYPE] old_fields[name]['empty'] = meta[name][EMPTY] old_fields[name]['class'] = meta[name][CLASS] fieldsdef = meta.header.fields if len(fieldsdef) % 32 != 0: raise BadDataError( "field definition block corrupt: %d bytes in size" % len(fieldsdef)) if len(fieldsdef) // 32 != meta.header.field_count: raise BadDataError( "Header shows %d fields, but field definition 
block has %d fields" % (meta.header.field_count, len(fieldsdef) // 32)) total_length = meta.header.record_length nulls_found = False starters = set() # keep track of starting values in case header is poorly created for starter in ('header', 'offset'): meta.fields[:] = [] offset = 1 for i in range(meta.header.field_count): fieldblock = fieldsdef[i*32:(i+1)*32] name = self._meta.decoder(unpack_str(fieldblock[:11]))[0] type = fieldblock[11] if not type in meta.fieldtypes: raise BadDataError("Unknown field type: %s" % type) if starter == 'header': start = unpack_long_int(fieldblock[12:16]) if start in starters: # poor header break starters.add(start) else: start = offset length = fieldblock[16] decimals = fieldblock[17] if type == CHAR: length += decimals * 256 end = start + length offset += length flags = fieldblock[18] null = flags & NULLABLE if null: nulls_found = True if name in meta.fields: raise BadDataError('Duplicate field name found: %s' % name) meta.fields.append(name) if name in old_fields and old_fields[name]['type'] == type: cls = old_fields[name]['class'] empty = old_fields[name]['empty'] else: cls = meta.fieldtypes[type]['Class'] empty = meta.fieldtypes[type]['Empty'] meta[name] = ( type, start, length, end, decimals, flags, cls, empty, null ) else: # made it through all the fields break if offset != total_length: raise BadDataError( "Header shows record length of %d, but calculated record length is %d" % (total_length, offset)) if nulls_found: nullable_fields = [f for f in meta if meta[f][NUL]] nullable_fields.sort(key=lambda f: f[START]) for i, f in enumerate(nullable_fields): meta[f][NUL] = i null_bytes, plus_one = divmod(len(nullable_fields), 8) if plus_one: null_bytes += 1 meta.empty_null = array('B', b'\x00' * null_bytes) meta.user_fields = FieldnameList([f for f in meta.fields if not meta[f][FLAGS] & SYSTEM]) meta.user_field_count = len(meta.user_fields) Record._create_blank_data(meta) class FpTable(Table): """ Provides an interface for working 
with FoxPro 2 tables """ _version = 'Foxpro' _versionabbr = 'fp' @MutableDefault def _field_types(): return { CHAR: { 'Type':'Character', 'Retrieve':retrieve_character, 'Update':update_character, 'Blank':lambda x: b' ' * x, 'Init':add_vfp_character, 'Class':unicode, 'Empty':unicode, 'flags':('BINARY', 'NOCPTRANS', 'NULL', ), }, FLOAT: { 'Type':'Float', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_vfp_numeric, 'Class':'default', 'Empty':none, 'flags':('NULL', ), }, NUMERIC: { 'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':lambda x: b' ' * x, 'Init':add_vfp_numeric, 'Class':'default', 'Empty':none, 'flags':('NULL', ), }, LOGICAL: { 'Type':'Logical', 'Retrieve':retrieve_logical, 'Update':update_logical, 'Blank':lambda x: b'?', 'Init':add_logical, 'Class':bool, 'Empty':none, 'flags':('NULL', ), }, DATE: { 'Type':'Date', 'Retrieve':retrieve_date, 'Update':update_date, 'Blank':lambda x: b' ', 'Init':add_date, 'Class':datetime.date, 'Empty':none, 'flags':('NULL', ), }, MEMO: { 'Type':'Memo', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ', 'Init':add_memo, 'Class':unicode, 'Empty':unicode, 'flags':('BINARY', 'NOCPTRANS', 'NULL', ), }, GENERAL: { 'Type':'General', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ', 'Init':add_binary_memo, 'Class':bytes, 'Empty':bytes, 'flags':('NULL', ), }, PICTURE: { 'Type':'Picture', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ', 'Init':add_binary_memo, 'Class':bytes, 'Empty':bytes, 'flags':('NULL', ), }, _NULLFLAG: { 'Type':'_NullFlags', 'Retrieve':unsupported_type, 'Update':unsupported_type, 'Blank':lambda x: b'\x00' * x, 'Init':None, 'Class':none, 'Empty':none, 'flags':('BINARY', 'SYSTEM', ), } } _memoext = '.fpt' _memoClass = _VfpMemo _yesMemoMask = 0xf5 # 1111 0101 _noMemoMask = 0x02 # 0000 0010 _binary_types = (GENERAL, MEMO, PICTURE) _character_types = (CHAR, DATE, FLOAT, LOGICAL, 
MEMO, NUMERIC) # field representing character data _currency_types = tuple() _date_types = (DATE, ) _datetime_types = tuple() _fixed_types = (DATE, GENERAL, LOGICAL, MEMO, PICTURE) _logical_types = (LOGICAL, ) _memo_types = (GENERAL, MEMO, PICTURE) _numeric_types = (FLOAT, NUMERIC) _text_types = (CHAR, MEMO) _variable_types = (CHAR, FLOAT, NUMERIC) _supported_tables = (0x02, 0x03, 0xf5) _dbfTableHeader = array('B', [0] * 32) _dbfTableHeader[0] = 0x02 # version - Foxbase _dbfTableHeader[8:10] = array('B', pack_short_int(33 + 263)) _dbfTableHeader[10] = 1 # record length -- one for delete flag _dbfTableHeader[29] = 3 # code page -- 437 US-MS DOS # _dbfTableHeader = to_bytes(_dbfTableHeader) _dbfTableHeaderExtra = b'\x00' * 263 def _check_memo_integrity(self): if not self._meta.ignorememos: memo_fields = False for field in self._meta.fields: if self._meta[field][TYPE] in self._memo_types: memo_fields = True break if memo_fields and not os.path.exists(self._meta.memoname): self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: memo fields exist without memo file") elif not memo_fields and os.path.exists(self._meta.memoname): self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: no memo fields exist but memo file does") if memo_fields: try: self._meta.memo = self._memoClass(self._meta) except Exception: exc = sys.exc_info()[1] self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: unable to use memo file (%s)" % exc.args[-1]).from_exc(None) def _initialize_fields(self): """ builds the FieldList of names, types, and descriptions """ meta = self._meta old_fields = defaultdict(dict) for name in meta.fields: old_fields[name]['type'] = meta[name][TYPE] old_fields[name]['class'] = meta[name][CLASS] old_fields[name]['empty'] = meta[name][EMPTY] meta.fields[:] = [] offset = 1 fieldsdef = meta.header.fields if len(fieldsdef) % 32 != 0: raise BadDataError("field definition 
block corrupt: %d bytes in size" % len(fieldsdef)) if len(fieldsdef) // 32 != meta.header.field_count: raise BadDataError( "Header shows %d fields, but field definition block has %d fields" % (meta.header.field_count, len(fieldsdef) // 32)) total_length = meta.header.record_length for i in range(meta.header.field_count): fieldblock = fieldsdef[i*32:(i+1)*32] name = self._meta.decoder(unpack_str(fieldblock[:11]))[0] type = fieldblock[11] if not type in meta.fieldtypes: raise BadDataError("Unknown field type: %s" % type) start = offset length = fieldblock[16] offset += length end = start + length decimals = fieldblock[17] flags = fieldblock[18] if name in meta.fields: raise BadDataError('Duplicate field name found: %s' % name) meta.fields.append(name) if name in old_fields and old_fields[name]['type'] == type: cls = old_fields[name]['class'] empty = old_fields[name]['empty'] else: cls = meta.fieldtypes[type]['Class'] empty = meta.fieldtypes[type]['Empty'] meta[name] = ( type, start, length, end, decimals, flags, cls, empty, 0, ) if offset != total_length: raise BadDataError("Header shows record length of %d, but calculated record length is %d" % (total_length, offset)) meta.user_fields = FieldnameList([f for f in meta.fields if not meta[f][FLAGS] & SYSTEM]) meta.user_field_count = len(meta.user_fields) Record._create_blank_data(meta) @staticmethod def _pack_date(date): """ Returns a group of three bytes, in integer form, of the date """ # return "%c%c%c" % (date.year - 2000, date.month, date.day) return array('B', [date.year - 2000, date.month, date.day]) @staticmethod def _unpack_date(bytestr): """ Returns a Date() of the packed three-byte date passed in """ year, month, day = struct.unpack(' 0 else: def __bool__(self): self._still_valid_check() return len(self) > 0 def __radd__(self, other): self._still_valid_check() key = self.key if isinstance(other, (Table, list)): other = self.__class__(other, key=key) if isinstance(other, self.__class__): 
other._still_valid_check() result = other.__class__() result._set = other._set.copy() result._list[:] = other._list[:] result._tables = {} result._tables.update(self._tables) result.key = other.key if key is other.key: # same key? just compare key values for item in self._list: result._maybe_add(item) else: # different keys, use this list's key on other's records for rec in self: result._maybe_add((source_table(rec), recno(rec), key(rec))) return result return NotImplemented def __repr__(self): self._still_valid_check() if self._desc: return "%s(key=(%s), desc=%s)" % (self.__class__, self.key.__doc__, self._desc) else: return "%s(key=(%s))" % (self.__class__, self.key.__doc__) def __rsub__(self, other): self._still_valid_check() key = self.key if isinstance(other, (Table, list)): other = self.__class__(other, key=key) if isinstance(other, self.__class__): other._still_valid_check() result = other.__class__() result._list[:] = other._list[:] result._set = other._set.copy() result._tables = {} result._tables.update(other._tables) result.key = key lost = set() if key is other.key: for item in self._list: if item[2] in result._list: result._set.remove(item[2]) lost.add(item) else: for rec in self: value = key(rec) if value in result._set: result._set.remove(value) lost.add((source_table(rec), recno(rec), value)) result._list = [item for item in result._list if item not in lost] lost = set(result._tables.keys()) for table, _1, _2 in result._list: if table in result._tables: lost.remove(table) if not lost: break for table in lost: del result._tables[table] return result return NotImplemented def __sub__(self, other): self._still_valid_check() key = self.key if isinstance(other, (Table, list)): other = self.__class__(other, key=key) if isinstance(other, self.__class__): other._still_valid_check() result = self.__class__() result._list[:] = self._list[:] result._set = self._set.copy() result._tables = {} result._tables.update(self._tables) result.key = key lost = set() if 
key is other.key: for item in other._list: if item[2] in result._set: result._set.remove(item[2]) lost.add(item[2]) else: for rec in other: value = key(rec) if value in result._set: result._set.remove(value) lost.add(value) result._list = [item for item in result._list if item[2] not in lost] lost = set(result._tables.keys()) for table, _1, _2 in result._list: if table in result._tables: lost.remove(table) if not lost: break for table in lost: del result._tables[table] return result return NotImplemented def _maybe_add(self, item): self._still_valid_check() table, recno, key = item self._tables[table] = table._pack_count # TODO: check that _pack_count is the same if already in table if key not in self._set: self._set.add(key) self._list.append(item) def _get_record(self, table=None, rec_no=None, value=None): if table is rec_no is None: table, rec_no, value = self._list[self._index] return table[rec_no] def _purge(self, record, old_record_number, offset): partial = source_table(record), old_record_number records = sorted(self._list, key=lambda item: (item[0], item[1])) for item in records: if partial == item[:2]: found = True break elif partial[0] is item[0] and partial[1] < item[1]: found = False break else: found = False if found: self._list.pop(self._list.index(item)) self._set.remove(item[2]) start = records.index(item) + found for item in records[start:]: if item[0] is not partial[0]: # into other table's records break i = self._list.index(item) self._set.remove(item[2]) item = item[0], (item[1] - offset), item[2] self._list[i] = item self._set.add(item[2]) return found def _still_valid_check(self): for table, last_pack in self._tables.items(): if last_pack != getattr(table, '_pack_count'): raise DbfError("table has been packed; list is invalid") _nav_check = _still_valid_check def append(self, record): self._still_valid_check() self._maybe_add((source_table(record), recno(record), self.key(record))) def clear(self): self._list = [] self._set = set() 
self._index = -1 self._tables.clear() def extend(self, records): self._still_valid_check() key = self.key if isinstance(records, self.__class__): if key is records.key: # same key? just compare key values for item in records._list: self._maybe_add(item) else: # different keys, use this list's key on other's records for rec in records: value = key(rec) self._maybe_add((source_table(rec), recno(rec), value)) else: for rec in records: value = key(rec) self._maybe_add((source_table(rec), recno(rec), value)) def index(self, record, start=None, stop=None): """ returns the index of record between start and stop start and stop default to the first and last record """ if not isinstance(record, (Record, RecordTemplate, dict, tuple)): raise TypeError("x should be a record, template, dict, or tuple, not %r" % type(record)) self._still_valid_check() if start is None: start = 0 if stop is None: stop = len(self) for i in range(start, stop): if record == (self[i]): return i else: raise NotFoundError("dbf.List.index(x): x not in List", data=record) def insert(self, i, record): self._still_valid_check() item = source_table(record), recno(record), self.key(record) if item not in self._set: self._set.add(item[2]) self._list.insert(i, item) def key(self, record): """ table_name, record_number """ self._still_valid_check() return source_table(record), recno(record) def pop(self, index=None): self._still_valid_check() if index is None: table, recno, value = self._list.pop() else: table, recno, value = self._list.pop(index) self._set.remove(value) return self._get_record(table, recno, value) def query(self, criteria): """ criteria is a callback that returns a truthy value for matching record """ return pqlc(self, criteria) def remove(self, data): self._still_valid_check() if not isinstance(data, (Record, RecordTemplate, dict, tuple)): raise TypeError("%r(%r) is not a record, template, tuple, nor dict" % (type(data), data)) index = self.index(data) record = self[index] item = 
source_table(record), recno(record), self.key(record) self._list.remove(item) self._set.remove(item[2]) def reverse(self): self._still_valid_check() return self._list.reverse() def sort(self, key=None, reverse=False): self._still_valid_check() if key is None: return self._list.sort(reverse=reverse) return self._list.sort(key=lambda item: key(item[0][item[1]]), reverse=reverse) # table meta table_types = { 'db3' : Db3Table, 'clp' : ClpTable, 'fp' : FpTable, 'vfp' : VfpTable, } # https://social.msdn.microsoft.com/Forums/en-US/315c582a-651f-4a2e-b51c-92aadef8bddf/opening-vfp-tables-with-fox26-dos?forum=visualfoxprogeneral # File type: # 0x02 FoxBASE # 0x03 FoxBASE+/Dbase III plus, no memo # 0x30 Visual FoxPro # 0x31 Visual FoxPro, autoincrement enabled # # 0x32 Visual FoxPro, Varchar, Varbinary, or Blob-enabled # 0x43 dBASE IV SQL table files, no memo # 0x63 dBASE IV SQL system files, no memo # 0x83 FoxBASE+/dBASE III PLUS, with memo # 0x8B dBASE IV with memo # 0xCB dBASE IV SQL table files, with memo # 0xF5 FoxPro 2.x (or earlier) with memo # 0xFB FoxBASE # version_map = { 0x02 : 'FoxBASE', 0x03 : 'dBase III Plus', 0x04 : 'dBase IV', 0x05 : 'dBase V', 0x30 : 'Visual FoxPro', 0x31 : 'Visual FoxPro (auto increment field)', 0x32 : 'Visual FoxPro (VarChar, VarBinary, or BLOB enabled)', 0x43 : 'dBase IV SQL table files', 0x63 : 'dBase IV SQL system files', 0x83 : 'dBase III Plus w/memos', 0x8b : 'dBase IV w/memos', 0x8e : 'dBase IV w/SQL table', 0xf5 : 'FoxPro w/memos'} code_pages = { 0x00 : ('ascii', "plain ol' ascii"), 0x01 : ('cp437', 'U.S. 
MS-DOS'), 0x02 : ('cp850', 'International MS-DOS'), 0x03 : ('cp1252', 'Windows ANSI'), 0x04 : ('mac_roman', 'Standard Macintosh'), 0x08 : ('cp865', 'Danish OEM'), 0x09 : ('cp437', 'Dutch OEM'), 0x0A : ('cp850', 'Dutch OEM (secondary)'), 0x0B : ('cp437', 'Finnish OEM'), 0x0D : ('cp437', 'French OEM'), 0x0E : ('cp850', 'French OEM (secondary)'), 0x0F : ('cp437', 'German OEM'), 0x10 : ('cp850', 'German OEM (secondary)'), 0x11 : ('cp437', 'Italian OEM'), 0x12 : ('cp850', 'Italian OEM (secondary)'), 0x13 : ('cp932', 'Japanese Shift-JIS'), 0x14 : ('cp850', 'Spanish OEM (secondary)'), 0x15 : ('cp437', 'Swedish OEM'), 0x16 : ('cp850', 'Swedish OEM (secondary)'), 0x17 : ('cp865', 'Norwegian OEM'), 0x18 : ('cp437', 'Spanish OEM'), 0x19 : ('cp437', 'English OEM (Britain)'), 0x1A : ('cp850', 'English OEM (Britain) (secondary)'), 0x1B : ('cp437', 'English OEM (U.S.)'), 0x1C : ('cp863', 'French OEM (Canada)'), 0x1D : ('cp850', 'French OEM (secondary)'), 0x1F : ('cp852', 'Czech OEM'), 0x22 : ('cp852', 'Hungarian OEM'), 0x23 : ('cp852', 'Polish OEM'), 0x24 : ('cp860', 'Portuguese OEM'), 0x25 : ('cp850', 'Portuguese OEM (secondary)'), 0x26 : ('cp866', 'Russian OEM'), 0x37 : ('cp850', 'English OEM (U.S.)
(secondary)'), 0x40 : ('cp852', 'Romanian OEM'), 0x4D : ('cp936', 'Chinese GBK (PRC)'), 0x4E : ('cp949', 'Korean (ANSI/OEM)'), 0x4F : ('cp950', 'Chinese Big 5 (Taiwan)'), 0x50 : ('cp874', 'Thai (ANSI/OEM)'), 0x57 : ('cp1252', 'ANSI'), 0x58 : ('cp1252', 'Western European ANSI'), 0x59 : ('cp1252', 'Spanish ANSI'), 0x64 : ('cp852', 'Eastern European MS-DOS'), 0x65 : ('cp866', 'Russian MS-DOS'), 0x66 : ('cp865', 'Nordic MS-DOS'), 0x67 : ('cp861', 'Icelandic MS-DOS'), 0x68 : (None, 'Kamenicky (Czech) MS-DOS'), 0x69 : (None, 'Mazovia (Polish) MS-DOS'), 0x6a : ('cp737', 'Greek MS-DOS (437G)'), 0x6b : ('cp857', 'Turkish MS-DOS'), 0x78 : ('cp950', 'Traditional Chinese (Hong Kong SAR, Taiwan) Windows'), 0x79 : ('cp949', 'Korean Windows'), 0x7a : ('cp936', 'Chinese Simplified (PRC, Singapore) Windows'), 0x7b : ('cp932', 'Japanese Windows'), 0x7c : ('cp874', 'Thai Windows'), 0x7d : ('cp1255', 'Hebrew Windows'), 0x7e : ('cp1256', 'Arabic Windows'), 0x87 : ('cp852', 'Slovenian OEM'), 0xc8 : ('cp1250', 'Eastern European Windows'), 0xc9 : ('cp1251', 'Russian Windows'), 0xca : ('cp1254', 'Turkish Windows'), 0xcb : ('cp1253', 'Greek Windows'), 0xcc : ('cp1257', 'Baltic Windows'), 0x96 : ('mac_cyrillic', 'Russian Macintosh'), 0x97 : ('mac_latin2', 'Macintosh EE'), 0x98 : ('mac_greek', 'Greek Macintosh'), 0xf0 : ('utf8', '8-bit unicode'), } dbf.default_codepage = default_codepage = code_pages.get(0x00)[0] def _codepage_lookup(cp): if cp not in code_pages: for code_page in sorted(code_pages.keys()): sd, ld = code_pages[code_page] if cp == sd or cp == ld: if sd is None: raise DbfError("Unsupported codepage: %s" % ld) cp = code_page break else: raise DbfError("Unsupported codepage: %s" % cp) sd, ld = code_pages[cp] return cp, sd, ld # miscellany class _Db4Table(Table): """ under development """ version = 'dBase IV w/memos (non-functional)' _versionabbr = 'db4' @MutableDefault def _field_types(): return { CHAR: {'Type':'Character', 'Retrieve':retrieve_character, 'Update':update_character, 
'Blank':lambda x: b' ' * x, 'Init':add_vfp_character}, CURRENCY: {'Type':'Currency', 'Retrieve':retrieve_currency, 'Update':update_currency, 'Blank':Decimal, 'Init':add_vfp_currency}, DOUBLE: {'Type':'Double', 'Retrieve':retrieve_double, 'Update':update_double, 'Blank':float, 'Init':add_vfp_double}, FLOAT: {'Type':'Float', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':float, 'Init':add_vfp_numeric}, NUMERIC: {'Type':'Numeric', 'Retrieve':retrieve_numeric, 'Update':update_numeric, 'Blank':int, 'Init':add_vfp_numeric}, INTEGER: {'Type':'Integer', 'Retrieve':retrieve_integer, 'Update':update_integer, 'Blank':int, 'Init':add_vfp_integer}, LOGICAL: {'Type':'Logical', 'Retrieve':retrieve_logical, 'Update':update_logical, 'Blank':Logical, 'Init':add_logical}, DATE: {'Type':'Date', 'Retrieve':retrieve_date, 'Update':update_date, 'Blank':Date, 'Init':add_date}, DATETIME: {'Type':'DateTime', 'Retrieve':retrieve_vfp_datetime, 'Update':update_vfp_datetime, 'Blank':DateTime, 'Init':add_vfp_datetime}, MEMO: {'Type':'Memo', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ' * x, 'Init':add_memo}, GENERAL: {'Type':'General', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ' * x, 'Init':add_memo}, PICTURE: {'Type':'Picture', 'Retrieve':retrieve_memo, 'Update':update_memo, 'Blank':lambda x: b' ' * x, 'Init':add_memo}, _NULLFLAG: {'Type':'_NullFlags', 'Retrieve':unsupported_type, 'Update':unsupported_type, 'Blank':int, 'Init':None} } _memoext = '.dbt' _memotypes = ('G', 'M', 'P') _memoClass = _VfpMemo _yesMemoMask = 0x8b # 1000 1011 _noMemoMask = 0x04 # 0000 0100 _fixed_fields = ('B', 'D', 'G', 'I', 'L', 'M', 'P', 'T', 'Y') _variable_fields = ('C', 'F', 'N') _binary_fields = ('G', 'P') _character_fields = ('C', 'M') # field representing character data _decimal_fields = ('F', 'N') _numeric_fields = ('B', 'F', 'I', 'N', 'Y') _currency_fields = ('Y',) _supported_tables = (0x04, 0x8b) _dbfTableHeader = [0] * 32 _dbfTableHeader[0] = 
0x8b # version - dBase IV w/memos 1000 1011 _dbfTableHeader[10] = 0x01 # record length -- one for delete flag _dbfTableHeader[29] = 0x03 # code page -- 437 US-MS DOS # _dbfTableHeader = bytes(_dbfTableHeader) _dbfTableHeaderExtra = b'' def _check_memo_integrity(self): """ dBase IV specific """ if self._meta.header.version == 0x8b: try: self._meta.memo = self._memoClass(self._meta) except: self._meta.dfd.close() self._meta.dfd = None raise if not self._meta.ignorememos: for field in self._meta.fields: if self._meta[field][TYPE] in self._memotypes: if self._meta.header.version != 0x8b: self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: memo fields exist, header declares no memos") elif not os.path.exists(self._meta.memoname): self._meta.dfd.close() self._meta.dfd = None raise BadDataError("Table structure corrupt: memo fields exist without memo file") break
dbf-0.99.10/dbf/test.py
import codecs import datetime import os import sys import unittest import tempfile import shutil import stat import warnings from unittest import skipIf, skipUnless, TestCase as unittest_TestCase py_ver = sys.version_info[:2] module = globals() from . import dbf from .
import * from .constants import * try: import pytz except ImportError: pytz = None if py_ver < (3, 0): MISC = ''.join([chr(i) for i in range(256)]) PHOTO = ''.join(reversed([chr(i) for i in range(256)])) else: unicode = str xrange = range module.update(LatinByte.__members__) MISC = ''.join([chr(i) for i in range(256)]).encode('latin-1') PHOTO = ''.join(reversed([chr(i) for i in range(256)])).encode('latin-1') try: with warnings.catch_warnings(): warnings.warn('test if warning is an exception', DbfWarning, stacklevel=1) warnings_are_exceptions = False except DbfWarning: warnings_are_exceptions = True print("\nTesting dbf version %d.%02d.%03d on %s with Python %s\n" % ( dbf.version[:3] + (sys.platform, sys.version) )) class TestCase(unittest_TestCase): def __init__(self, *args, **kwds): regex = getattr(self, 'assertRaisesRegex', None) if regex is None: self.assertRaisesRegex = getattr(self, 'assertRaisesRegexp') super(TestCase, self).__init__(*args, **kwds) # Walker in Leaves -- by Scot Noel -- http://www.scienceandfantasyfiction.com/sciencefiction/Walker-in-Leaves/walker-in-leaves.htm words = """ Soft rains, given time, have rounded the angles of great towers. Generation after generation, wind borne seeds have brought down cities amid the gentle tangle of their roots. All statues of stone have been worn away. Still one statue, not of stone, holds its lines against the passing years. Sunlight, fading autumn light, warms the sculpture as best it can, almost penetrating to its dreaming core. The figure is that of a woman, once the fair sex of a species now untroubled and long unseen. Man sleeps the sleep of extinction. This one statue remains. Behind the grace of its ivory brow and gentle, unseeing eyes, the statue dreams. A susurrus of voices, a flutter of images, and the dream tumbles down through the long morning. Suspended. Floating on the stream that brings from the heart of time the wandering self. 
Maya for that is the statue s name-- is buoyed by the sensation, rising within the cage of consciousness, but asleep. She has been this way for months: the unmoving figure of a woman caught in mid stride across the glade. The warmth of sunlight on her face makes her wonder if she will ever wake again. Even at the end, there was no proper word for what Maya has become. Robot. Cybernetic Organism. Android. These are as appropriate to her condition as calling the stars campfires of the night sky and equally precise. It is enough to know that her motive energies are no longer sun and sustenance, and though Maya was once a living woman, a scientist, now she inhabits a form of ageless attraction. It is a form whose energies are flagging. With great determination, Maya moves toward wakefulness. Flex a finger. Move a hand. Think of the lemurs, their tongues reaching out in stroke after stroke for the drip of the honeyed thorns. Though there is little time left to save her charges, Maya s only choice is the patience of the trees. On the day her energies return, it is autumn of the year following the morning her sleep began. Maya opens her eyes. The woman, the frozen machine --that which is both-- moves once more. Two lemur cubs tumbling near the edge of the glade take notice. One rushes forward to touch Maya s knee and laugh. Maya reaches out with an arthritic hand, cold in its sculpted smoothness, but the lemur darts away. Leaves swirl about its retreat, making a crisp sound. The cub displays a playfulness Maya s fevered mind cannot match. The second cub rolls between her moss covered feet, laughing. The lemurs are her charges, and she is failing them. Still, it is good to be awake. Sugar maples and sumacs shoulder brilliant robes. In the low sun, their orange and purple hues startle the heart. Of course, Maya has no beating organ, no heart. Her life energies are transmitted from deep underground. Nor are the cubs truly lemurs, nor the sugar maples the trees of old. 
The names have carried for ten million seasons, but the species have changed. Once the lemurs inhabited an island off the southeast coast of a place called Africa. Now they are here, much changed, in the great forests of the northern climes. The young lemurs seem hale, and it speaks well for their consanguine fellows. But their true fate lies in the story of DNA, of a few strands in the matriarchal line, of a sequence code-named "hope." No doubt a once clever acronym, today Maya s clouded mind holds nothing of the ancient codes. She knows only that a poet once spoke of hope as "the thing with feathers that perches in the soul." Emily Dickinson. A strange name, and so unlike the agnomen of the lemurs. What has become of Giver-of-Corn? Having no reason to alarm the cubs, Maya moves with her hands high, so that any movement will be down as leaves fall. Though anxious about Giver-of-Corn, she ambles on to finish the mission begun six months earlier. Ahead, the shadow of a mound rises up beneath a great oak. A door awaits. Somewhere below the forest, the engine that gives her life weakens. Held in sway to its faltering beat her mind and body froze, sending her into an abyss of dreams. She has been striding toward that door for half a year, unknowing if she would ever wake again. Vines lose their toughened grip as the door responds to Maya s approach. Regretfully, a tree root snaps, but the door shudders to a halt before its whine of power can cross the glade. Suddenly, an opening has been made into the earth, and Maya steps lightly on its downward slope. Without breathing, she catches a scent of mold and of deep, uncirculated water. A flutter like the sound of wings echoes from the hollow. Her vision adjusts as she descends. In spots, lights attempt to greet her, but it is a basement she enters, flickering and ancient, where the footfalls of millipedes wear tracks in grime older than the forest above. After a long descent, she steps into water. 
How long ago was it that the floor was dry? The exactitude of such time, vast time, escapes her. Once this place sustained great scholars, scientists. Now sightless fish skip through broken walls, retreating as Maya wades their private corridors, finding with each step that she remembers the labyrinthine path to the heart of power. A heart that flutters like dark wings. And with it, she flutters too. The closer she comes to the vault in which the great engine is housed, the less hopeful she becomes. The vault housing the engine rests beneath a silvered arch. Its mirrored surface denies age, even as a new generation of snails rise up out of the dark pool, mounting first the dais of pearled stone left by their ancestors, the discarded shells of millions, then higher to where the overhang drips, layered in egg sacs bright as coral. Maya has no need to set the vault door in motion, to break the dance of the snails. The state of things tells her all she needs to know. There shall be no repairs, no rescue; the engine will die, and she with it. Still, it is impossible not to check. At her touch, a breath of firefly lights coalesces within the patient dampness of the room. They confirm. The heart is simply too tired to go on. Its last reserves wield processes of great weight and care, banking the fires of its blood, dimming the furnace into safe resolve. Perhaps a month or two in cooling, then the last fire kindled by man shall die. For the figure standing knee deep in water the issues are more immediate. The powers that allow her to live will be the first to fade. It is amazing, even now, that she remains cognizant. For a moment, Maya stands transfixed by her own reflection. The silvered arch holds it as moonlight does a ghost. She is a sculpted thing with shoulders of white marble. Lips of stone. A child s face. No, the grace of a woman resides in the features, as though eternity can neither deny the sage nor touch the youth. Demeter. The Earth Mother. 
Maya smiles at the Greek metaphor. She has never before thought of herself as divine, nor monumental. When the energies of the base are withdrawn entirely, she will become immobile. Once a goddess, then a statue to be worn away by endless time, the crumbling remnant of something the self has ceased to be. Maya trembles at the thought. The last conscious reserve of man will soon fade forever from the halls of time. As if hewn of irresolute marble, Maya begins to shake; were she still human there would be sobs; there would be tears to moisten her grief and add to the dark waters at her feet. In time, Maya breaks the spell. She sets aside her grief to work cold fingers over the dim firefly controls, giving what priorities remain to her survival. In response, the great engine promises little, but does what it can. While life remains, Maya determines to learn what she can of the lemurs, of their progress, and the fate of the matriarchal line. There will be time enough for dreams. Dreams. The one that tumbled down through the long morning comes to her and she pauses to consider it. There was a big table. Indistinct voices gathered around it, but the energy of a family gathering filled the space. The warmth of the room curled about her, perfumed by the smell of cooking. An ancient memory, from a time before the shedding of the flesh. Outside, children laughed. A hand took hers in its own, bringing her to a table filled with colorful dishes and surrounded by relatives and friends. Thanksgiving? They re calling me home, Maya thinks. If indeed her ancestors could reach across time and into a form not of the flesh, perhaps that was the meaning of the dream. I am the last human consciousness, and I am being called home. With a flutter, Maya is outside, and the trees all but bare of leaves. Something has happened. Weeks have passed and she struggles to take in her situation. This time she has neither dreamed nor stood immobile, but she has been active without memory. 
Her arms cradle a lemur, sheltering the pubescent female against the wind. They sit atop a ridge that separates the valley from the forest to the west, and Walker-in-Leaves has been gone too long. That much Maya remembers. The female lemur sighs. It is a rumbling, mournful noise, and she buries her head tighter against Maya. This is Giver-of-Corn, and Walker is her love. With her free hand, Maya works at a stiff knitting of pine boughs, the blanket which covers their legs. She pulls it up to better shelter Giver-of-Corn. Beside them, on a shell of bark, a sliver of fish has gone bad from inattention. They wait through the long afternoon, but Walker does not return. When it is warmest and Giver sleeps, Maya rises in stages, gently separating herself from the lemur. She covers her charge well. Soon it will snow. There are few memories after reaching the vault, only flashes, and that she has been active in a semi-consciousness state frightens Maya. She stumbles away, shaking, but there is no comfort to seek. She does not know if her diminished abilities endanger the lemurs, and considers locking herself beneath the earth. But the sun is warm, and for the moment every thought is a cloudless sky. Memories descend from the past like a lost tribe wandering for home. To the east lie once powerful lands and remembered sepulchers. The life of the gods, the pulse of kings, it has all vanished and gone. Maya thinks back to the days of man. There was no disaster at the end. Just time. Civilization did not fail, it succumbed to endless seasons. Each vast stretch of years drawn on by the next saw the conquest of earth and stars, then went on, unheeding, until man dwindled and his monuments frayed. To the west rise groves of oaks and grassland plains, beyond them, mountains that shrugged off civilization more easily than the rest. Where is the voyager in those leaves? A flash of time and Maya finds herself deep in the forests to the west. 
A lemur call escapes her throat, and suddenly she realizes she is searching for Walker-in-Leaves. The season is the same. Though the air is crisp, the trees are not yet unburdened of their color. "Walker!" she calls out. "Your love is dying. She mourns your absence." At the crest of a rise, Maya finds another like herself, but one long devoid of life. This sculpted form startles her at first. It has been almost wholly absorbed into the trunk of a great tree. The knee and calf of one leg escape the surrounding wood, as does a shoulder, the curve of a breast, a mournful face. A single hand reaches out from the tree toward the valley below. In the distance, Maya sees the remnants of a fallen orbiter. Its power nacelle lies buried deep beneath the river that cushioned its fall. Earth and water, which once heaved at the impact, have worn down impenetrable metals and grown a forest over forgotten technologies. Had the watcher in the tree come to see the fall, or to stand vigil over the corpse? Maya knows only that she must go on before the hills and the trees conspire to bury her. She moves on, continuing to call for Walker-in-Leaves. In the night, a coyote finally answers Maya, its frenetic howls awakening responses from many cousins, hunting packs holding court up and down the valley. Giver-of-Corn holds the spark of her generation. It is not much. A gene here and there, a deep manipulation of the flesh. The consciousness that was man is not easy to engender. Far easier to make an eye than a mind to see. Along a path of endless complication, today Giver-of-Corn mourns the absence of her mate. That Giver may die of such stubborn love before passing on her genes forces Maya deeper into the forest, using the last of her strength to call endlessly into the night. Maya is dreaming. It s Thanksgiving, but the table is cold. The chairs are empty, and no one answers her call. As she walks from room to room, the lights dim and it begins to rain within the once familiar walls. 
When Maya opens her eyes, it is to see Giver-of-Corn sleeping beneath a blanket of pine boughs, the young lemur s bushy tail twitching to the rhythm of sorrowful dreams. Maya is awake once more, but unaware of how much time has passed, or why she decided to return. Her most frightening thought is that she may already have found Walker-in-Leaves, or what the coyotes left behind. Up from the valley, two older lemurs walk arm in arm, supporting one another along the rise. They bring with them a twig basket and a pouch made of hide. The former holds squash, its hollowed interior brimming with water, the latter a corn mash favored by the tribe. They are not without skills, these lemurs. Nor is language unknown to them. They have known Maya forever and treat her, not as a god, but as a force of nature. With a few brief howls, by clicks, chatters, and the sweeping gestures of their tails, the lemurs make clear their plea. Their words all but rhyme. Giver-of-Corn will not eat for them. Will she eat for Maya? Thus has the mission to found a new race come down to this: with her last strength, Maya shall spoon feed a grieving female. The thought strikes her as both funny and sad, while beyond her thoughts, the lemurs continue to chatter. Scouts have been sent, the elders assure Maya, brave sires skilled in tracking. They hope to find Walker before the winter snows. Their voices stir Giver, and she howls in petty anguish at her benefactors, then disappears beneath the blanket. The elders bow their heads and turn to go, oblivious of Maya s failures. Days pass upon the ridge in a thickness of clouds. Growing. Advancing. Dimmed by the mountainous billows above, the sun gives way to snow, and Maya watches Giver focus ever more intently on the line to the west. As the lemur s strength fails, her determination to await Walker s return seems to grow stronger still. Walker-in-Leaves holds a spark of his own. He alone ventures west after the harvest.
He has done it before, always returning with a colored stone, a bit of metal, or a flower never before seen by the tribe. It is as if some mad vision compels him, for the journey s end brings a collection of smooth and colored booty to be arranged in a crescent beneath a small monolith Walker himself toiled to raise. Large stones and small, the lemur has broken two fingers of its left hand doing this. To Maya, it seems the ambition of butterflies and falling leaves, of no consequence beyond a motion in the sun. The only importance now is to keep the genes within Giver alive. Long ago, an ambition rose among the last generation of men, of what had once been men: to cultivate a new consciousness upon the Earth. Maya neither led nor knew the masters of the effort, but she was there when the first prosimians arrived, fresh from their land of orchids and baobabs. Men gathered lemurs and said to them "we shall make you men." Long years followed in the work of the genes, gentling the generations forward. Yet with each passing season, the cultivators grew fewer and their skills less true. So while the men died of age, or boredom, or despair, the lemurs prospered in their youth. To warm the starving lemur, Maya builds a fire. For this feat the tribe has little skill, nor do they know zero, nor that a lever can move the world. She holds Giver close and pulls the rough blanket of boughs about them both. All this time, Maya s thoughts remain clear, and the giving of comfort comforts her as well. The snow begins to cover the monument Walker-in-Leaves has built upon the ridge. As Maya stares on and on into the fire, watching it absorb the snow, watching the snow conquer the cold stones and the grasses already bowed under a cloak of white, she drifts into a flutter of reverie, a weakening of consciousness. The gate to the end is closing, and she shall never know never know. "I ll take it easy like, an stay around de house this winter," her father said. 
"There s carpenter work for me to do." Other voices joined in around a table upon which a vast meal had been set. Thanksgiving. At the call of their names, the children rushed in from outside, their laughter quick as sunlight, their jackets smelling of autumn and leaves. Her mother made them wash and bow their heads in prayer. Those already seated joined in. Grandmother passed the potatoes and called Maya her little kolache, rattling on in a series of endearments and concerns Maya s ear could not follow. Her mother passed on the sense of it and reminded Maya of the Czech for Thank you, Grandma. It s good to be home, she thinks at first, then: where is the walker in those leaves? A hand on which two fingers lay curled by the power of an old wound touches Maya. It shakes her, then gently moves her arms so that its owner can pull back the warm pine boughs hiding Giver-of-Corn. Eyes first, then smile to tail, Giver opens herself to the returning wanderer. Walker-in-Leaves has returned, and the silence of their embrace brings the whole of the ridge alive in a glitter of sun-bright snow. Maya too comes awake, though this time neither word nor movement prevails entirely upon the fog of sleep. When the answering howls come to the ridge, those who follow help Maya to stand. She follows them back to the shelter of the valley, and though she stumbles, there is satisfaction in the hurried gait, in the growing pace of the many as they gather to celebrate the return of the one. Songs of rejoicing join the undisciplined and cacophonous barks of youth. Food is brought, from the deep stores, from the caves and their recesses. Someone heats fish over coals they have kept sheltered and going for months. The thought of this ingenuity heartens Maya. A delicacy of honeyed thorns is offered with great ceremony to Giver-of-Corn, and she tastes at last something beyond the bitterness of loss.
Though Walker-in-Leaves hesitates to leave the side of his love, the others demand stories, persuading him to the center where he begins a cacophonous song of his own. Maya hopes to see what stones Walker has brought from the west this time, but though she tries to speak, the sounds are forgotten. The engine fades. The last flicker of man s fire is done, and with it the effort of her desires overcome her. She is gone. Around a table suited for the Queen of queens, a thousand and a thousand sit. Mother to daughter, side-by-side, generation after generation of lemurs share in the feast. Maya is there, hearing the excited voices and the stern warnings to prayer. To her left and her right, each daughter speaks freely. Then the rhythms change, rising along one side to the cadence of Shakespeare and falling along the other to howls the forest first knew. Unable to contain herself, Maya rises. She pushes on toward the head of a table she cannot see, beginning at last to run. What is the height her charges have reached? How far have they advanced? Lemur faces turn to laugh, their wide eyes joyous and amused. As the generations pass, she sees herself reflected in spectacles, hears the jangle of bracelets and burnished metal, watches matrons laugh behind scarves of silk. Then at last, someone with sculpted hands directs her outside, where the other children are at play in the leaves, now and forever. 
THE END""".split() # data numbers = [2,3,5,7,11,13,17,19,23,29,31,37,41,43,47,53,59,61,67,71,73,79,83,89,97,101,103,107,109,113,127,131,137,139,149,151,157,163,167,173,179,181,191,193,197,199,211,223,227,229,233,239,241,251,257,263,269,271,277,281,283,293,307,311,313,317,331,337,347,349,353,359,367,373,379,383,389,397,401,409,419,421,431,433,439,443,449,457,461,463,467,479,487,491,499,503,509,521,523,541] floats = [] last = 1 for number in numbers: floats.append(float(number ** 2 / last)) last = number def permutate(Xs, N): if N <= 0: yield [] return for x in Xs: for sub in permutate(Xs, N-1): result = [x]+sub # don't allow duplicates for item in result: if result.count(item) > 1: break else: yield result def combinate(Xs, N): """Generate combinations of N items from list Xs""" if N == 0: yield [] return for i in xrange(len(Xs)-N+1): for r in combinate(Xs[i+1:], N-1): yield [Xs[i]] + r def index(sequence): "returns integers 0 - len(sequence)" for i in xrange(len(sequence)): yield i # tests def active(rec): if is_deleted(rec): return DoNotIndex return dbf.recno(rec) def inactive(rec): if is_deleted(rec): return recno(rec) return DoNotIndex def unicodify(data): if isinstance(data, list): for i, item in enumerate(data): data[i] = unicode(item) return data elif isinstance(data, dict): new_data = {} for k, v in data.items(): new_data[unicode(k)] = v return new_data else: raise TypeError('unknown type: %r' % (data, )) class TestChar(TestCase): def test_exceptions(self): "exceptions" self.assertRaises(ValueError, Char, 7) self.assertRaises(ValueError, Char, [u'nope']) self.assertRaises(ValueError, Char, True) self.assertRaises(ValueError, Char, False) self.assertRaises(ValueError, Char, type) self.assertRaises(ValueError, Char, str) self.assertRaises(ValueError, Char, None) def test_bools_and_none(self): "booleans and None" empty = Char() self.assertFalse(bool(empty)) one = Char(u' ') self.assertFalse(bool(one)) actual = Char(u'1') self.assertTrue(bool(actual)) def 
test_equality(self): "equality" a1 = Char(u'a') a2 = u'a ' self.assertEqual(a1, a2) self.assertEqual(a2, a1) a3 = u'a ' a4 = Char(u'a ') self.assertEqual(a3, a4) self.assertEqual(a4, a3) def test_inequality(self): "inequality" a1 = Char(u'ab ') a2 = u'a b' self.assertNotEqual(a1, a2) self.assertNotEqual(a2, a1) a3 = u'ab ' a4 = Char(u'a b') self.assertNotEqual(a3, a4) self.assertNotEqual(a4, a3) def test_less_than(self): "less-than" a1 = Char(u'a') a2 = u'a ' self.assertFalse(a1 < a2) self.assertFalse(a2 < a1) a3 = u'a ' a4 = Char(u'a ') self.assertFalse(a3 < a4) self.assertFalse(a4 < a3) a5 = u'abcd' a6 = u'abce' self.assertTrue(a5 < a6) self.assertFalse(a6 < a5) def test_less_than_equal(self): "less-than or equal" a1 = Char(u'a') a2 = u'a ' self.assertTrue(a1 <= a2) self.assertTrue(a2 <= a1) a3 = u'a ' a4 = Char(u'a ') self.assertTrue(a3 <= a4) self.assertTrue(a4 <= a3) a5 = u'abcd' a6 = u'abce' self.assertTrue(a5 <= a6) self.assertFalse(a6 <= a5) def test_greater_than(self): "greater-than or equal" a1 = Char(u'a') a2 = u'a ' self.assertTrue(a1 >= a2) self.assertTrue(a2 >= a1) a3 = u'a ' a4 = Char(u'a ') self.assertTrue(a3 >= a4) self.assertTrue(a4 >= a3) a5 = u'abcd' a6 = u'abce' self.assertFalse(a5 >= a6) self.assertTrue(a6 >= a5) def test_greater_than_equal(self): "greater-than" a1 = Char(u'a') a2 = u'a ' self.assertFalse(a1 > a2) self.assertFalse(a2 > a1) a3 = u'a ' a4 = Char(u'a ') self.assertFalse(a3 > a4) self.assertFalse(a4 > a3) a5 = u'abcd' a6 = u'abce' self.assertFalse(a5 > a6) self.assertTrue(a6 > a5) class TestDateTime(TestCase): "Testing Date" def test_date_creation(self): "Date creation" self.assertEqual(Date(), NullDate) self.assertEqual(Date.fromymd(' '), NullDate) self.assertEqual(Date.fromymd('00000000'), NullDate) self.assertEqual(Date.fromordinal(0), NullDate) self.assertEqual(Date.today(), datetime.date.today()) self.assertEqual(Date.max, datetime.date.max) self.assertEqual(Date.min, datetime.date.min) self.assertEqual(Date(2018, 5, 21), 
datetime.date(2018, 5, 21)) self.assertEqual(Date.strptime('2018-01-01'), datetime.date(2018, 1, 1)) self.assertRaises(ValueError, Date.fromymd, '00000') self.assertRaises(ValueError, Date, 0, 0, 0) def test_date_compare(self): "Date comparisons" nodate1 = Date() nodate2 = Date() date1 = Date.fromordinal(1000) date2 = Date.fromordinal(2000) date3 = Date.fromordinal(3000) self.compareTimes(nodate1, nodate2, date1, date2, date3) def test_datetime_creation(self): "DateTime creation" self.assertEqual(DateTime(), NullDateTime) self.assertEqual(DateTime.fromordinal(0), NullDateTime) self.assertTrue(DateTime.today()) self.assertEqual(DateTime.max, datetime.datetime.max) self.assertEqual(DateTime.min, datetime.datetime.min) self.assertEqual(DateTime(2018, 5, 21, 19, 17, 16), datetime.datetime(2018, 5, 21, 19, 17 ,16)) self.assertEqual(DateTime.strptime('2018-01-01 19:17:16'), datetime.datetime(2018, 1, 1, 19, 17, 16)) def test_datetime_compare(self): "DateTime comparisons" nodatetime1 = DateTime() nodatetime2 = DateTime() datetime1 = DateTime.fromordinal(1000) datetime2 = DateTime.fromordinal(20000) datetime3 = DateTime.fromordinal(300000) self.compareTimes(nodatetime1, nodatetime2, datetime1, datetime2, datetime3) def test_datetime_replace(self): "DateTime replacements" datetime_target = DateTime(2001, 5, 31, 23, 59, 59, 999000) datetime1 = datetime.datetime(2001, 5, 31, 23, 59, 59, 999230) datetime2 = datetime.datetime(2001, 5, 31, 23, 59, 59, 999500) datetime3 = datetime.datetime(2001, 5, 31, 23, 59, 59, 999728) original_datetime = datetime.datetime for dt in (datetime1, datetime2, datetime3): class DateTimeNow(datetime.datetime): @classmethod def now(self): datetime.datetime = original_datetime return dt datetime.datetime = DateTimeNow result = DateTime.now() self.assertEqual(result, datetime_target, 'in: %r out: %r desired: %r' % (dt, result, datetime_target)) def test_time_creation(self): "Time creation" self.assertEqual(Time(), NullTime) self.assertEqual(Time.max, 
datetime.time.max) self.assertEqual(Time.min, datetime.time.min) self.assertEqual(Time(19, 17, 16), datetime.time(19, 17 ,16)) self.assertEqual(Time.strptime('19:17:16'), datetime.time(19, 17, 16)) def test_time_compare(self): "Time comparisons" notime1 = Time() notime2 = Time() time1 = Time.fromfloat(7.75) time2 = Time.fromfloat(9.5) time3 = Time.fromfloat(16.25) self.compareTimes(notime1, notime2, time1, time2, time3) @unittest.skipIf(pytz is None, 'pytz not installed') def test_datetime_tz(self): "DateTime with Time Zones" pst = pytz.timezone('America/Los_Angeles') mst = pytz.timezone('America/Boise') cst = pytz.timezone('America/Chicago') est = pytz.timezone('America/New_York') utc = pytz.timezone('UTC') # pdt = DateTime(2018, 5, 20, 5, 41, 33, tzinfo=pst) mdt = DateTime(2018, 5, 20, 6, 41, 33, tzinfo=mst) cdt = DateTime(2018, 5, 20, 7, 41, 33, tzinfo=cst) edt = DateTime(2018, 5, 20, 8, 41, 33, tzinfo=est) udt = DateTime(2018, 5, 20, 12, 41, 33, tzinfo=utc) self.assertTrue(pdt == mdt == cdt == edt == udt) # dup1 = DateTime.combine(pdt.date(), mdt.timetz()) dup2 = DateTime.combine(cdt.date(), Time(5, 41, 33, tzinfo=pst)) self.assertTrue(dup1 == dup2 == udt) # udt2 = DateTime(2018, 5, 20, 13, 41, 33, tzinfo=utc) mdt2 = mdt.replace(tzinfo=pst) self.assertTrue(mdt2 == udt2) # with self.assertRaisesRegex(ValueError, 'not naive datetime'): DateTime(pdt, tzinfo=mst) with self.assertRaisesRegex(ValueError, 'not naive datetime'): DateTime(datetime.datetime(2018, 5, 27, 15, 57, 11, tzinfo=pst), tzinfo=pst) with self.assertRaisesRegex(ValueError, 'not naive time'): Time(pdt.timetz(), tzinfo=mst) with self.assertRaisesRegex(ValueError, 'not naive time'): Time(datetime.time(15, 58, 59, tzinfo=mst), tzinfo=mst) # if py_ver < (3, 0): from xmlrpclib import Marshaller, loads else: from xmlrpc.client import Marshaller, loads self.assertEqual( udt.utctimetuple(), loads(Marshaller().dumps([pdt]), use_datetime=True)[0][0].utctimetuple(), ) # self.assertEqual( pdt, 
DateTime.combine(Date(2018, 5, 20), Time(5, 41, 33), tzinfo=pst), ) def test_arithmetic(self): "Date, DateTime, & Time Arithmetic" one_day = datetime.timedelta(1) a_day = Date(1970, 5, 20) self.assertEqual(a_day + one_day, Date(1970, 5, 21)) self.assertEqual(a_day - one_day, Date(1970, 5, 19)) self.assertEqual(datetime.date(1970, 5, 21) - a_day, one_day) a_time = Time(12) one_second = datetime.timedelta(0, 1, 0) self.assertEqual(a_time + one_second, Time(12, 0, 1)) self.assertEqual(a_time - one_second, Time(11, 59, 59)) self.assertEqual(datetime.time(12, 0, 1) - a_time, one_second) an_appt = DateTime(2012, 4, 15, 12, 30, 00) displacement = datetime.timedelta(1, 60*60*2+60*15) self.assertEqual(an_appt + displacement, DateTime(2012, 4, 16, 14, 45, 0)) self.assertEqual(an_appt - displacement, DateTime(2012, 4, 14, 10, 15, 0)) self.assertEqual(datetime.datetime(2012, 4, 16, 14, 45, 0) - an_appt, displacement) def test_none_compare(self): "comparisons to None" empty_date = Date() empty_time = Time() empty_datetime = DateTime() self.assertEqual(empty_date, None) self.assertEqual(empty_time, None) self.assertEqual(empty_datetime, None) def test_singletons(self): "singletons" empty_date = Date() empty_time = Time() empty_datetime = DateTime() self.assertTrue(empty_date is NullDate) self.assertTrue(empty_time is NullTime) self.assertTrue(empty_datetime is NullDateTime) def test_boolean_value(self): "boolean evaluation" empty_date = Date() empty_time = Time() empty_datetime = DateTime() self.assertEqual(bool(empty_date), False) self.assertEqual(bool(empty_time), False) self.assertEqual(bool(empty_datetime), False) actual_date = Date.today() actual_time = Time.now() actual_datetime = DateTime.now() self.assertEqual(bool(actual_date), True) self.assertEqual(bool(actual_time), True) self.assertEqual(bool(actual_datetime), True) def compareTimes(self, empty1, empty2, uno, dos, tres): self.assertTrue(empty1 is empty2) self.assertTrue(empty1 < uno, '%r is not less than %r' % 
(empty1, uno)) self.assertFalse(empty1 > uno, '%r is less than %r' % (empty1, uno)) self.assertTrue(uno > empty1, '%r is not greater than %r' % (empty1, uno)) self.assertFalse(uno < empty1, '%r is greater than %r' % (empty1, uno)) self.assertEqual(uno < dos, True) self.assertEqual(uno <= dos, True) self.assertEqual(dos <= dos, True) self.assertEqual(dos <= tres, True) self.assertEqual(dos < tres, True) self.assertEqual(tres <= tres, True) self.assertEqual(uno == uno, True) self.assertEqual(dos == dos, True) self.assertEqual(tres == tres, True) self.assertEqual(uno != dos, True) self.assertEqual(dos != tres, True) self.assertEqual(tres != uno, True) self.assertEqual(tres >= tres, True) self.assertEqual(tres > dos, True) self.assertEqual(dos >= dos, True) self.assertEqual(dos >= uno, True) self.assertEqual(dos > uno, True) self.assertEqual(uno >= uno, True) self.assertEqual(uno >= dos, False) self.assertEqual(uno >= tres, False) self.assertEqual(dos >= tres, False) self.assertEqual(tres <= dos, False) self.assertEqual(tres <= uno, False) self.assertEqual(tres < tres, False) self.assertEqual(tres < dos, False) self.assertEqual(tres < uno, False) self.assertEqual(dos < dos, False) self.assertEqual(dos < uno, False) self.assertEqual(uno < uno, False) self.assertEqual(uno == dos, False) self.assertEqual(uno == tres, False) self.assertEqual(dos == uno, False) self.assertEqual(dos == tres, False) self.assertEqual(tres == uno, False) self.assertEqual(tres == dos, False) self.assertEqual(uno != uno, False) self.assertEqual(dos != dos, False) self.assertEqual(tres != tres, False) class TestNull(TestCase): def test_all(self): NULL = Null = dbf.Null() self.assertTrue(NULL is dbf.Null()) self.assertTrue(NULL + 1 is Null) self.assertTrue(1 + NULL is Null) NULL += 4 self.assertTrue(NULL is Null) value = 5 value += NULL self.assertTrue(value is Null) self.assertTrue(NULL - 2 is Null) self.assertTrue(2 - NULL is Null) NULL -= 5 self.assertTrue(NULL is Null) value = 6 value -= NULL 
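Every assertion in this TestNull block checks the same invariant: any expression touching Null collapses back to the Null singleton, and augmented assignments rebind their target to it. A minimal Python 3 sketch of that absorbing-singleton pattern; `NullType` and its operator table here are illustrative, not dbf's actual implementation:

```python
class NullType(object):
    """Absorbing null singleton: every operation returns the singleton itself."""
    _instance = None

    def __new__(cls):
        # always hand back the same object, so 'is' comparisons work
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

    def _absorb(self, *args):
        return self

    # binary, reflected, and unary operators all propagate Null
    __add__ = __radd__ = __sub__ = __rsub__ = _absorb
    __mul__ = __rmul__ = __truediv__ = __rtruediv__ = _absorb
    __neg__ = __pos__ = __abs__ = __invert__ = _absorb

    def __repr__(self):
        return ''

NULL = NullType()
assert NULL + 1 is NULL
assert 1 - NULL is NULL          # reflected operators absorb too
assert -NULL is NULL
assert NullType() is NULL        # construction always yields the singleton
```

Because `int` defines no in-place operators, `value += NULL` falls back to `1 + NULL`, which is why the tests can assert the rebound name `is Null` afterwards.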
self.assertTrue(value is Null) self.assertTrue(NULL / 0 is Null) self.assertTrue(3 / NULL is Null) NULL /= 6 self.assertTrue(NULL is Null) value = 7 value /= NULL self.assertTrue(value is Null) self.assertTrue(NULL * -3 is Null) self.assertTrue(4 * NULL is Null) NULL *= 7 self.assertTrue(NULL is Null) value = 8 value *= NULL self.assertTrue(value is Null) self.assertTrue(NULL % 1 is Null) self.assertTrue(7 % NULL is Null) NULL %= 1 self.assertTrue(NULL is Null) value = 9 value %= NULL self.assertTrue(value is Null) self.assertTrue(NULL ** 2 is Null) self.assertTrue(4 ** NULL is Null) NULL **= 3 self.assertTrue(NULL is Null) value = 9 value **= NULL self.assertTrue(value is Null) self.assertTrue(NULL & 1 is Null) self.assertTrue(1 & NULL is Null) NULL &= 1 self.assertTrue(NULL is Null) value = 1 value &= NULL self.assertTrue(value is Null) self.assertTrue(NULL ^ 1 is Null) self.assertTrue(1 ^ NULL is Null) NULL ^= 1 self.assertTrue(NULL is Null) value = 1 value ^= NULL self.assertTrue(value is Null) self.assertTrue(NULL | 1 is Null) self.assertTrue(1 | NULL is Null) NULL |= 1 self.assertTrue(NULL is Null) value = 1 value |= NULL self.assertTrue(value is Null) self.assertTrue(str(divmod(NULL, 1)) == '(, )') self.assertTrue(str(divmod(1, NULL)) == '(, )') self.assertTrue(NULL << 1 is Null) self.assertTrue(2 << NULL is Null) NULL <<=3 self.assertTrue(NULL is Null) value = 9 value <<= NULL self.assertTrue(value is Null) self.assertTrue(NULL >> 1 is Null) self.assertTrue(2 >> NULL is Null) NULL >>= 3 self.assertTrue(NULL is Null) value = 9 value >>= NULL self.assertTrue(value is Null) self.assertTrue(-NULL is Null) self.assertTrue(+NULL is Null) self.assertTrue(abs(NULL) is Null) self.assertTrue(~NULL is Null) self.assertTrue(NULL.attr is Null) self.assertTrue(NULL() is Null) self.assertTrue(getattr(NULL, 'fake') is Null) self.assertRaises(TypeError, hash, NULL) class TestLogical(TestCase): "Testing Logical" def test_unknown(self): "Unknown" for unk in '', '?', ' ', 
None, Null, Unknown, Other: huh = Logical(unk) self.assertEqual(huh == None, True, "huh is %r from %r, which is not None" % (huh, unk)) self.assertEqual(huh != None, False, "huh is %r from %r, which is not None" % (huh, unk)) self.assertEqual(huh != True, True, "huh is %r from %r, which is not None" % (huh, unk)) self.assertEqual(huh == True, False, "huh is %r from %r, which is not None" % (huh, unk)) self.assertEqual(huh != False, True, "huh is %r from %r, which is not None" % (huh, unk)) self.assertEqual(huh == False, False, "huh is %r from %r, which is not None" % (huh, unk)) self.assertRaises(ValueError, lambda : (0, 1, 2)[huh]) def test_true(self): "true" for true in 'True', 'yes', 't', 'Y', 7, ['blah']: huh = Logical(true) self.assertEqual(huh == True, True) self.assertEqual(huh != True, False) self.assertEqual(huh == False, False, "%r is not True" % true) self.assertEqual(huh != False, True) self.assertEqual(huh == None, False) self.assertEqual(huh != None, True) self.assertEqual((0, 1, 2)[huh], 1) def test_false(self): "false" for false in 'false', 'No', 'F', 'n', 0, []: huh = Logical(false) self.assertEqual(huh != False, False) self.assertEqual(huh == False, True) self.assertEqual(huh != True, True) self.assertEqual(huh == True, False) self.assertEqual(huh != None, True) self.assertEqual(huh == None, False) self.assertEqual((0, 1, 2)[huh], 0) def test_singletons(self): "singletons" heh = Logical(True) hah = Logical('Yes') ick = Logical(False) ack = Logical([]) unk = Logical('?') bla = Logical(None) self.assertEqual(heh is hah, True) self.assertEqual(ick is ack, True) self.assertEqual(unk is bla, True) def test_error(self): "errors" self.assertRaises(ValueError, Logical, 'wrong') def test_and(self): "and" true = Logical(True) false = Logical(False) unknown = Logical(None) self.assertEqual((true & true) is true, True) self.assertEqual((true & false) is false, True) self.assertEqual((false & true) is false, True) self.assertEqual((false & false) is false, 
True) self.assertEqual((true & unknown) is unknown, True) self.assertEqual((false & unknown) is false, True) self.assertEqual((unknown & true) is unknown, True) self.assertEqual((unknown & false) is false, True) self.assertEqual((unknown & unknown) is unknown, True) self.assertEqual((true & True) is true, True) self.assertEqual((true & False) is false, True) self.assertEqual((false & True) is false, True) self.assertEqual((false & False) is false, True) self.assertEqual((true & None) is unknown, True) self.assertEqual((false & None) is false, True) self.assertEqual((unknown & True) is unknown, True) self.assertEqual((unknown & False) is false, True) self.assertEqual((unknown & None) is unknown, True) self.assertEqual((True & true) is true, True) self.assertEqual((True & false) is false, True) self.assertEqual((False & true) is false, True) self.assertEqual((False & false) is false, True) self.assertEqual((True & unknown) is unknown, True) self.assertEqual((False & unknown) is false, True) self.assertEqual((None & true) is unknown, True) self.assertEqual((None & false) is false, True) self.assertEqual((None & unknown) is unknown, True) self.assertEqual(type(true & 0), int) self.assertEqual(true & 0, 0) self.assertEqual(type(true & 3), int) self.assertEqual(true & 3, 1) self.assertEqual(type(false & 0), int) self.assertEqual(false & 0, 0) self.assertEqual(type(false & 2), int) self.assertEqual(false & 2, 0) self.assertEqual(type(unknown & 0), int) self.assertEqual(unknown & 0, 0) self.assertEqual(unknown & 2, unknown) t = true t &= true self.assertEqual(t is true, True) t = true t &= false self.assertEqual(t is false, True) f = false f &= true self.assertEqual(f is false, True) f = false f &= false self.assertEqual(f is false, True) t = true t &= unknown self.assertEqual(t is unknown, True) f = false f &= unknown self.assertEqual(f is false, True) u = unknown u &= true self.assertEqual(u is unknown, True) u = unknown u &= false self.assertEqual(u is false, True) u = 
unknown u &= unknown self.assertEqual(u is unknown, True) t = true t &= True self.assertEqual(t is true, True) t = true t &= False self.assertEqual(t is false, True) f = false f &= True self.assertEqual(f is false, True) f = false f &= False self.assertEqual(f is false, True) t = true t &= None self.assertEqual(t is unknown, True) f = false f &= None self.assertEqual(f is false, True) u = unknown u &= True self.assertEqual(u is unknown, True) u = unknown u &= False self.assertEqual(u is false, True) u = unknown u &= None self.assertEqual(u is unknown, True) t = True t &= true self.assertEqual(t is true, True) t = True t &= false self.assertEqual(t is false, True) f = False f &= true self.assertEqual(f is false, True) f = False f &= false self.assertEqual(f is false, True) t = True t &= unknown self.assertEqual(t is unknown, True) f = False f &= unknown self.assertEqual(f is false, True) u = None u &= true self.assertEqual(u is unknown, True) u = None u &= false self.assertEqual(u is false, True) u = None u &= unknown self.assertEqual(u is unknown, True) t = true t &= 0 self.assertEqual(type(true & 0), int) t = true t &= 0 self.assertEqual(true & 0, 0) t = true t &= 3 self.assertEqual(type(true & 3), int) t = true t &= 3 self.assertEqual(true & 3, 1) f = false f &= 0 self.assertEqual(type(false & 0), int) f = false f &= 0 self.assertEqual(false & 0, 0) f = false f &= 2 self.assertEqual(type(false & 2), int) f = false f &= 2 self.assertEqual(false & 2, 0) u = unknown u &= 0 self.assertEqual(type(unknown & 0), int) u = unknown u &= 0 self.assertEqual(unknown & 0, 0) u = unknown u &= 2 self.assertEqual(unknown & 2, unknown) def test_or(self): "or" true = Logical(True) false = Logical(False) unknown = Logical(None) self.assertEqual((true | true) is true, True) self.assertEqual((true | false) is true, True) self.assertEqual((false | true) is true, True) self.assertEqual((false | false) is false, True) self.assertEqual((true | unknown) is true, True) 
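The `&` and `|` truth tables exercised in these tests follow Kleene's strong three-valued logic, with None standing in for unknown: False dominates AND and True dominates OR, even against an unknown operand; otherwise unknown propagates. A sketch of the rule as plain functions (not dbf's `Logical` class):

```python
def k3_and(a, b):
    """Kleene AND: False dominates even an unknown (None) operand."""
    if a is False or b is False:
        return False
    if a is None or b is None:
        return None
    return True

def k3_or(a, b):
    """Kleene OR: True dominates even an unknown (None) operand."""
    if a is True or b is True:
        return True
    if a is None or b is None:
        return None
    return False

# mirrors the assertions above: false & unknown is false, true | unknown is true
assert k3_and(False, None) is False
assert k3_and(True, None) is None
assert k3_or(True, None) is True
assert k3_or(False, None) is None
```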
self.assertEqual((false | unknown) is unknown, True) self.assertEqual((unknown | true) is true, True) self.assertEqual((unknown | false) is unknown, True) self.assertEqual((unknown | unknown) is unknown, True) self.assertEqual((true | True) is true, True) self.assertEqual((true | False) is true, True) self.assertEqual((false | True) is true, True) self.assertEqual((false | False) is false, True) self.assertEqual((true | None) is true, True) self.assertEqual((false | None) is unknown, True) self.assertEqual((unknown | True) is true, True) self.assertEqual((unknown | False) is unknown, True) self.assertEqual((unknown | None) is unknown, True) self.assertEqual((True | true) is true, True) self.assertEqual((True | false) is true, True) self.assertEqual((False | true) is true, True) self.assertEqual((False | false) is false, True) self.assertEqual((True | unknown) is true, True) self.assertEqual((False | unknown) is unknown, True) self.assertEqual((None | true) is true, True) self.assertEqual((None | false) is unknown, True) self.assertEqual((None | unknown) is unknown, True) self.assertEqual(type(true | 0), int) self.assertEqual(true | 0, 1) self.assertEqual(type(true | 2), int) self.assertEqual(true | 2, 3) self.assertEqual(type(false | 0), int) self.assertEqual(false | 0, 0) self.assertEqual(type(false | 2), int) self.assertEqual(false | 2, 2) self.assertEqual(unknown | 0, unknown) self.assertEqual(unknown | 2, unknown) t = true t |= true self.assertEqual(t is true, True) t = true t |= false self.assertEqual(t is true, True) f = false f |= true self.assertEqual(f is true, True) f = false f |= false self.assertEqual(f is false, True) t = true t |= unknown self.assertEqual(t is true, True) f = false f |= unknown self.assertEqual(f is unknown, True) u = unknown u |= true self.assertEqual(u is true, True) u = unknown u |= false self.assertEqual(u is unknown, True) u = unknown u |= unknown self.assertEqual(u is unknown, True) t = true t |= True self.assertEqual(t is true, 
True) t = true t |= False self.assertEqual(t is true, True) f = false f |= True self.assertEqual(f is true, True) f = false f |= False self.assertEqual(f is false, True) t = true t |= None self.assertEqual(t is true, True) f = false f |= None self.assertEqual(f is unknown, True) u = unknown u |= True self.assertEqual(u is true, True) u = unknown u |= False self.assertEqual(u is unknown, True) u = unknown u |= None self.assertEqual(u is unknown, True) t = True t |= true self.assertEqual(t is true, True) t = True t |= false self.assertEqual(t is true, True) f = False f |= true self.assertEqual(f is true, True) f = False f |= false self.assertEqual(f is false, True) t = True t |= unknown self.assertEqual(t is true, True) f = False f |= unknown self.assertEqual(f is unknown, True) u = None u |= true self.assertEqual(u is true, True) u = None u |= false self.assertEqual(u is unknown, True) u = None u |= unknown self.assertEqual(u is unknown, True) t = true t |= 0 self.assertEqual(type(t), int) t = true t |= 0 self.assertEqual(t, 1) t = true t |= 2 self.assertEqual(type(t), int) t = true t |= 2 self.assertEqual(t, 3) f = false f |= 0 self.assertEqual(type(f), int) f = false f |= 0 self.assertEqual(f, 0) f = false f |= 2 self.assertEqual(type(f), int) f = false f |= 2 self.assertEqual(f, 2) u = unknown u |= 0 self.assertEqual(u, unknown) def test_xor(self): "xor" true = Logical(True) false = Logical(False) unknown = Logical(None) self.assertEqual((true ^ true) is false, True) self.assertEqual((true ^ false) is true, True) self.assertEqual((false ^ true) is true, True) self.assertEqual((false ^ false) is false, True) self.assertEqual((true ^ unknown) is unknown, True) self.assertEqual((false ^ unknown) is unknown, True) self.assertEqual((unknown ^ true) is unknown, True) self.assertEqual((unknown ^ false) is unknown, True) self.assertEqual((unknown ^ unknown) is unknown, True) self.assertEqual((true ^ True) is false, True) self.assertEqual((true ^ False) is true, True) 
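Unlike `&` and `|`, XOR has no dominating operand: flipping a bit by an unknown amount is still unknown, which is why every `^` involving unknown yields unknown in these tests. A sketch under the same conventions (None for unknown, plain function rather than dbf's `Logical` class):

```python
def k3_xor(a, b):
    """Kleene XOR: any unknown (None) operand makes the result unknown."""
    if a is None or b is None:
        return None
    return bool(a) != bool(b)

assert k3_xor(True, True) is False
assert k3_xor(True, False) is True
assert k3_xor(True, None) is None
assert k3_xor(None, None) is None
```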
self.assertEqual((false ^ True) is true, True) self.assertEqual((false ^ False) is false, True) self.assertEqual((true ^ None) is unknown, True) self.assertEqual((false ^ None) is unknown, True) self.assertEqual((unknown ^ True) is unknown, True) self.assertEqual((unknown ^ False) is unknown, True) self.assertEqual((unknown ^ None) is unknown, True) self.assertEqual((True ^ true) is false, True) self.assertEqual((True ^ false) is true, True) self.assertEqual((False ^ true) is true, True) self.assertEqual((False ^ false) is false, True) self.assertEqual((True ^ unknown) is unknown, True) self.assertEqual((False ^ unknown) is unknown, True) self.assertEqual((None ^ true) is unknown, True) self.assertEqual((None ^ false) is unknown, True) self.assertEqual((None ^ unknown) is unknown, True) self.assertEqual(type(true ^ 2), int) self.assertEqual(true ^ 2, 3) self.assertEqual(type(true ^ 0), int) self.assertEqual(true ^ 0, 1) self.assertEqual(type(false ^ 0), int) self.assertEqual(false ^ 0, 0) self.assertEqual(type(false ^ 2), int) self.assertEqual(false ^ 2, 2) self.assertEqual(unknown ^ 0, unknown) self.assertEqual(unknown ^ 2, unknown) t = true t ^= true self.assertEqual(t is false, True) t = true t ^= false self.assertEqual(t is true, True) f = false f ^= true self.assertEqual(f is true, True) f = false f ^= false self.assertEqual(f is false, True) t = true t ^= unknown self.assertEqual(t is unknown, True) f = false f ^= unknown self.assertEqual(f is unknown, True) u = unknown u ^= true self.assertEqual(u is unknown, True) u = unknown u ^= false self.assertEqual(u is unknown, True) u = unknown u ^= unknown self.assertEqual(u is unknown, True) t = true t ^= True self.assertEqual(t is false, True) t = true t ^= False self.assertEqual(t is true, True) f = false f ^= True self.assertEqual(f is true, True) f = false f ^= False self.assertEqual(f is false, True) t = true t ^= None self.assertEqual(t is unknown, True) f = false f ^= None self.assertEqual(f is unknown, 
True) u = unknown u ^= True self.assertEqual(u is unknown, True) u = unknown u ^= False self.assertEqual(u is unknown, True) u = unknown u ^= None self.assertEqual(u is unknown, True) t = True t ^= true self.assertEqual(t is false, True) t = True t ^= false self.assertEqual(t is true, True) f = False f ^= true self.assertEqual(f is true, True) f = False f ^= false self.assertEqual(f is false, True) t = True t ^= unknown self.assertEqual(t is unknown, True) f = False f ^= unknown self.assertEqual(f is unknown, True) u = None u ^= true self.assertEqual(u is unknown, True) u = None u ^= false self.assertEqual(u is unknown, True) u = None u ^= unknown self.assertEqual(u is unknown, True) t = true t ^= 0 self.assertEqual(type(true ^ 0), int) t = true t ^= 0 self.assertEqual(true ^ 0, 1) t = true t ^= 2 self.assertEqual(type(true ^ 2), int) t = true t ^= 2 self.assertEqual(true ^ 2, 3) f = false f ^= 0 self.assertEqual(type(false ^ 0), int) f = false f ^= 0 self.assertEqual(false ^ 0, 0) f = false f ^= 2 self.assertEqual(type(false ^ 2), int) f = false f ^= 2 self.assertEqual(false ^ 2, 2) u = unknown u ^= 0 self.assertEqual(unknown ^ 0, unknown) u = unknown u ^= 2 self.assertEqual(unknown ^ 2, unknown) def test_negation(self): "negation" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(-true, -1) self.assertEqual(-false, 0) self.assertEqual(-none, none) def test_posation(self): "posation" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(+true, 1) self.assertEqual(+false, 0) self.assertEqual(+none, none) def test_abs(self): "abs()" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(abs(true), 1) self.assertEqual(abs(false), 0) self.assertEqual(abs(none), none) def test_invert(self): "~ operator" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(~true, false) self.assertEqual(~false, true) self.assertEqual(~none, none) def test_complex(self): 
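The numeric-conversion tests in this stretch (complex, int, long, float, oct, hex) all enforce one rule: true converts as 1, false as 0, and unknown refuses to convert. A sketch of that rule with a hypothetical helper (`logical_to_int` is illustrative, not part of dbf):

```python
def logical_to_int(value):
    """Conversion rule the tests check: True -> 1, False -> 0, unknown (None) raises."""
    if value is None:
        raise ValueError('cannot convert unknown Logical to int')
    return int(bool(value))

assert logical_to_int(True) == 1
assert logical_to_int(False) == 0
```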
"complex" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(complex(true), complex(1)) self.assertEqual(complex(false), complex(0)) self.assertRaises(ValueError, complex, none) def test_int(self): "int" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(int(true), 1) self.assertEqual(int(false), 0) self.assertRaises(ValueError, int, none) if py_ver < (3, 0): def test_long(self): "long" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(long(true), long(1)) self.assertEqual(long(false), long(0)) self.assertRaises(ValueError, long, none) def test_float(self): "float" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(float(true), 1.0) self.assertEqual(float(false), 0.0) self.assertRaises(ValueError, float, none) def test_oct(self): "oct" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(oct(true), oct(1)) self.assertEqual(oct(false), oct(0)) self.assertRaises(ValueError, oct, none) def test_hex(self): "hex" true = Logical(True) false = Logical(False) none = Logical(None) self.assertEqual(hex(true), hex(1)) self.assertEqual(hex(false), hex(0)) self.assertRaises(ValueError, hex, none) def test_addition(self): "addition" true = Logical(True) false = Logical(False) unknown = Logical(None) self.assertEqual(true + true, 2) self.assertEqual(true + false, 1) self.assertEqual(false + true, 1) self.assertEqual(false + false, 0) self.assertEqual(true + unknown, unknown) self.assertEqual(false + unknown, unknown) self.assertEqual(unknown + true, unknown) self.assertEqual(unknown + false, unknown) self.assertEqual(unknown + unknown, unknown) self.assertEqual(true + True, 2) self.assertEqual(true + False, 1) self.assertEqual(false + True, 1) self.assertEqual(false + False, 0) self.assertEqual(true + None, unknown) self.assertEqual(false + None, unknown) self.assertEqual(unknown + True, unknown) self.assertEqual(unknown + 
False, unknown) self.assertEqual(unknown + None, unknown) self.assertEqual(True + true, 2) self.assertEqual(True + false, 1) self.assertEqual(False + true, 1) self.assertEqual(False + false, 0) self.assertEqual(True + unknown, unknown) self.assertEqual(False + unknown, unknown) self.assertEqual(None + true, unknown) self.assertEqual(None + false, unknown) self.assertEqual(None + unknown, unknown) t = true t += true self.assertEqual(t, 2) t = true t += false self.assertEqual(t, 1) f = false f += true self.assertEqual(f, 1) f = false f += false self.assertEqual(f, 0) t = true t += unknown self.assertEqual(t, unknown) f = false f += unknown self.assertEqual(f, unknown) u = unknown u += true self.assertEqual(u, unknown) u = unknown u += false self.assertEqual(u, unknown) u = unknown u += unknown self.assertEqual(u, unknown) t = true t += True self.assertEqual(t, 2) t = true t += False self.assertEqual(t, 1) f = false f += True self.assertEqual(f, 1) f = false f += False self.assertEqual(f, 0) t = true t += None self.assertEqual(t, unknown) f = false f += None self.assertEqual(f, unknown) u = unknown u += True self.assertEqual(u, unknown) u = unknown u += False self.assertEqual(u, unknown) u = unknown u += None self.assertEqual(u, unknown) t = True t += true self.assertEqual(t, 2) t = True t += false self.assertEqual(t, 1) f = False f += true self.assertEqual(f, 1) f = False f += false self.assertEqual(f, 0) t = True t += unknown self.assertEqual(t, unknown) f = False f += unknown self.assertEqual(f, unknown) u = None u += true self.assertEqual(u, unknown) u = None u += false self.assertEqual(u, unknown) u = None u += unknown self.assertEqual(u, unknown) def test_multiplication(self): "multiplication" true = Logical(True) false = Logical(False) unknown = Logical(None) self.assertEqual(true * true, 1) self.assertEqual(true * false, 0) self.assertEqual(false * true, 0) self.assertEqual(false * false, 0) self.assertEqual(true * unknown, unknown) self.assertEqual(false * 
unknown, 0) self.assertEqual(unknown * true, unknown) self.assertEqual(unknown * false, 0) self.assertEqual(unknown * unknown, unknown) self.assertEqual(true * True, 1) self.assertEqual(true * False, 0) self.assertEqual(false * True, 0) self.assertEqual(false * False, 0) self.assertEqual(true * None, unknown) self.assertEqual(false * None, 0) self.assertEqual(unknown * True, unknown) self.assertEqual(unknown * False, 0) self.assertEqual(unknown * None, unknown) self.assertEqual(True * true, 1) self.assertEqual(True * false, 0) self.assertEqual(False * true, 0) self.assertEqual(False * false, 0) self.assertEqual(True * unknown, unknown) self.assertEqual(False * unknown, 0) self.assertEqual(None * true, unknown) self.assertEqual(None * false, 0) self.assertEqual(None * unknown, unknown) t = true t *= true self.assertEqual(t, 1) t = true t *= false self.assertEqual(t, 0) f = false f *= true self.assertEqual(f, 0) f = false f *= false self.assertEqual(f, 0) t = true t *= unknown self.assertEqual(t, unknown) f = false f *= unknown self.assertEqual(f, 0) u = unknown u *= true self.assertEqual(u, unknown) u = unknown u *= false self.assertEqual(u, 0) u = unknown u *= unknown self.assertEqual(u, unknown) t = true t *= True self.assertEqual(t, 1) t = true t *= False self.assertEqual(t, 0) f = false f *= True self.assertEqual(f, 0) f = false f *= False self.assertEqual(f, 0) t = true t *= None self.assertEqual(t, unknown) f = false f *= None self.assertEqual(f, 0) u = unknown u *= True self.assertEqual(u, unknown) u = unknown u *= False self.assertEqual(u, 0) u = unknown u *= None self.assertEqual(u, unknown) t = True t *= true self.assertEqual(t, 1) t = True t *= false self.assertEqual(t, 0) f = False f *= true self.assertEqual(f, 0) f = False f *= false self.assertEqual(f, 0) t = True t *= unknown self.assertEqual(t, unknown) f = False f *= unknown self.assertEqual(f, 0) u = None u *= true self.assertEqual(u, unknown) u = None u *= false self.assertEqual(u, 0) u = None u 
*= unknown
        self.assertEqual(u, unknown)

    def test_subtraction(self):
        "subtraction"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(true - true, 0)
        self.assertEqual(true - false, 1)
        self.assertEqual(false - true, -1)
        self.assertEqual(false - false, 0)
        self.assertEqual(true - unknown, unknown)
        self.assertEqual(false - unknown, unknown)
        self.assertEqual(unknown - true, unknown)
        self.assertEqual(unknown - false, unknown)
        self.assertEqual(unknown - unknown, unknown)
        self.assertEqual(true - True, 0)
        self.assertEqual(true - False, 1)
        self.assertEqual(false - True, -1)
        self.assertEqual(false - False, 0)
        self.assertEqual(true - None, unknown)
        self.assertEqual(false - None, unknown)
        self.assertEqual(unknown - True, unknown)
        self.assertEqual(unknown - False, unknown)
        self.assertEqual(unknown - None, unknown)
        self.assertEqual(True - true, 0)
        self.assertEqual(True - false, 1)
        self.assertEqual(False - true, -1)
        self.assertEqual(False - false, 0)
        self.assertEqual(True - unknown, unknown)
        self.assertEqual(False - unknown, unknown)
        self.assertEqual(None - true, unknown)
        self.assertEqual(None - false, unknown)
        self.assertEqual(None - unknown, unknown)
        t = true
        t -= true
        self.assertEqual(t, 0)
        t = true
        t -= false
        self.assertEqual(t, 1)
        f = false
        f -= true
        self.assertEqual(f, -1)
        f = false
        f -= false
        self.assertEqual(f, 0)
        t = true
        t -= unknown
        self.assertEqual(t, unknown)
        f = false
        f -= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u -= true
        self.assertEqual(u, unknown)
        u = unknown
        u -= false
        self.assertEqual(u, unknown)
        u = unknown
        u -= unknown
        self.assertEqual(u, unknown)
        t = true
        t -= True
        self.assertEqual(t, 0)
        t = true
        t -= False
        self.assertEqual(t, 1)
        f = false
        f -= True
        self.assertEqual(f, -1)
        f = false
        f -= False
        self.assertEqual(f, 0)
        t = true
        t -= None
        self.assertEqual(t, unknown)
        f = false
        f -= None
        self.assertEqual(f, unknown)
        u = unknown
        u -= True
        self.assertEqual(u, unknown)
        u = unknown
        u -= False
        self.assertEqual(u, unknown)
        u = unknown
        u -= None
        self.assertEqual(u, unknown)
        t = True
        t -= true
        self.assertEqual(t, 0)
        t = True
        t -= false
        self.assertEqual(t, 1)
        f = False
        f -= true
        self.assertEqual(f, -1)
        f = False
        f -= false
        self.assertEqual(f, 0)
        t = True
        t -= unknown
        self.assertEqual(t, unknown)
        f = False
        f -= unknown
        self.assertEqual(f, unknown)
        u = None
        u -= true
        self.assertEqual(u, unknown)
        u = None
        u -= false
        self.assertEqual(u, unknown)
        u = None
        u -= unknown
        self.assertEqual(u, unknown)

    def test_division(self):
        "division"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(true / true, 1)
        self.assertEqual(true / false, unknown)
        self.assertEqual(false / true, 0)
        self.assertEqual(false / false, unknown)
        self.assertEqual(true / unknown, unknown)
        self.assertEqual(false / unknown, unknown)
        self.assertEqual(unknown / true, unknown)
        self.assertEqual(unknown / false, unknown)
        self.assertEqual(unknown / unknown, unknown)
        self.assertEqual(true / True, 1)
        self.assertEqual(true / False, unknown)
        self.assertEqual(false / True, 0)
        self.assertEqual(false / False, unknown)
        self.assertEqual(true / None, unknown)
        self.assertEqual(false / None, unknown)
        self.assertEqual(unknown / True, unknown)
        self.assertEqual(unknown / False, unknown)
        self.assertEqual(unknown / None, unknown)
        self.assertEqual(True / true, 1)
        self.assertEqual(True / false, unknown)
        self.assertEqual(False / true, 0)
        self.assertEqual(False / false, unknown)
        self.assertEqual(True / unknown, unknown)
        self.assertEqual(False / unknown, unknown)
        self.assertEqual(None / true, unknown)
        self.assertEqual(None / false, unknown)
        self.assertEqual(None / unknown, unknown)
        t = true
        t /= true
        self.assertEqual(t, 1)
        t = true
        t /= false
        self.assertEqual(t, unknown)
        f = false
        f /= true
        self.assertEqual(f, 0)
        f = false
        f /= false
        self.assertEqual(f, unknown)
        t = true
        t /= unknown
        self.assertEqual(t, unknown)
        f = false
        f /= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u /= true
        self.assertEqual(u, unknown)
        u = unknown
        u /= false
        self.assertEqual(u, unknown)
        u = unknown
        u /= unknown
        self.assertEqual(u, unknown)
        t = true
        t /= True
        self.assertEqual(t, 1)
        t = true
        t /= False
        self.assertEqual(t, unknown)
        f = false
        f /= True
        self.assertEqual(f, 0)
        f = false
        f /= False
        self.assertEqual(f, unknown)
        t = true
        t /= None
        self.assertEqual(t, unknown)
        f = false
        f /= None
        self.assertEqual(f, unknown)
        u = unknown
        u /= True
        self.assertEqual(u, unknown)
        u = unknown
        u /= False
        self.assertEqual(u, unknown)
        u = unknown
        u /= None
        self.assertEqual(u, unknown)
        t = True
        t /= true
        self.assertEqual(t, 1)
        t = True
        t /= false
        self.assertEqual(t, unknown)
        f = False
        f /= true
        self.assertEqual(f, 0)
        f = False
        f /= false
        self.assertEqual(f, unknown)
        t = True
        t /= unknown
        self.assertEqual(t, unknown)
        f = False
        f /= unknown
        self.assertEqual(f, unknown)
        u = None
        u /= true
        self.assertEqual(u, unknown)
        u = None
        u /= false
        self.assertEqual(u, unknown)
        u = None
        u /= unknown
        self.assertEqual(u, unknown)
        self.assertEqual(true // true, 1)
        self.assertEqual(true // false, unknown)
        self.assertEqual(false // true, 0)
        self.assertEqual(false // false, unknown)
        self.assertEqual(true // unknown, unknown)
        self.assertEqual(false // unknown, unknown)
        self.assertEqual(unknown // true, unknown)
        self.assertEqual(unknown // false, unknown)
        self.assertEqual(unknown // unknown, unknown)
        self.assertEqual(true // True, 1)
        self.assertEqual(true // False, unknown)
        self.assertEqual(false // True, 0)
        self.assertEqual(false // False, unknown)
        self.assertEqual(true // None, unknown)
        self.assertEqual(false // None, unknown)
        self.assertEqual(unknown // True, unknown)
        self.assertEqual(unknown // False, unknown)
        self.assertEqual(unknown // None, unknown)
        self.assertEqual(True // true, 1)
        self.assertEqual(True // false, unknown)
        self.assertEqual(False // true, 0)
        self.assertEqual(False // false, unknown)
        self.assertEqual(True // unknown, unknown)
        self.assertEqual(False // unknown, unknown)
        self.assertEqual(None // true, unknown)
        self.assertEqual(None // false, unknown)
        self.assertEqual(None // unknown, unknown)
        t = true
        t //= true
        self.assertEqual(t, 1)
        t = true
        t //= false
        self.assertEqual(t, unknown)
        f = false
        f //= true
        self.assertEqual(f, 0)
        f = false
        f //= false
        self.assertEqual(f, unknown)
        t = true
        t //= unknown
        self.assertEqual(t, unknown)
        f = false
        f //= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u //= true
        self.assertEqual(u, unknown)
        u = unknown
        u //= false
        self.assertEqual(u, unknown)
        u = unknown
        u //= unknown
        self.assertEqual(u, unknown)
        t = true
        t //= True
        self.assertEqual(t, 1)
        t = true
        t //= False
        self.assertEqual(t, unknown)
        f = false
        f //= True
        self.assertEqual(f, 0)
        f = false
        f //= False
        self.assertEqual(f, unknown)
        t = true
        t //= None
        self.assertEqual(t, unknown)
        f = false
        f //= None
        self.assertEqual(f, unknown)
        u = unknown
        u //= True
        self.assertEqual(u, unknown)
        u = unknown
        u //= False
        self.assertEqual(u, unknown)
        u = unknown
        u //= None
        self.assertEqual(u, unknown)
        t = True
        t //= true
        self.assertEqual(t, 1)
        t = True
        t //= false
        self.assertEqual(t, unknown)
        f = False
        f //= true
        self.assertEqual(f, 0)
        f = False
        f //= false
        self.assertEqual(f, unknown)
        t = True
        t //= unknown
        self.assertEqual(t, unknown)
        f = False
        f //= unknown
        self.assertEqual(f, unknown)
        u = None
        u //= true
        self.assertEqual(u, unknown)
        u = None
        u //= false
        self.assertEqual(u, unknown)
        u = None
        u //= unknown
        self.assertEqual(u, unknown)

    def test_shift(self):
        "<< and >>"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(true >> true, 0)
        self.assertEqual(true >> false, 1)
        self.assertEqual(false >> true, 0)
        self.assertEqual(false >> false, 0)
        self.assertEqual(true >> unknown, unknown)
        self.assertEqual(false >> unknown, unknown)
        self.assertEqual(unknown >> true, unknown)
        self.assertEqual(unknown >> false, unknown)
        self.assertEqual(unknown >> unknown, unknown)
        self.assertEqual(true >> True, 0)
        self.assertEqual(true >> False, 1)
        self.assertEqual(false >> True, 0)
        self.assertEqual(false >> False, 0)
        self.assertEqual(true >> None, unknown)
        self.assertEqual(false >> None, unknown)
        self.assertEqual(unknown >> True, unknown)
        self.assertEqual(unknown >> False, unknown)
        self.assertEqual(unknown >> None, unknown)
        self.assertEqual(True >> true, 0)
        self.assertEqual(True >> false, 1)
        self.assertEqual(False >> true, 0)
        self.assertEqual(False >> false, 0)
        self.assertEqual(True >> unknown, unknown)
        self.assertEqual(False >> unknown, unknown)
        self.assertEqual(None >> true, unknown)
        self.assertEqual(None >> false, unknown)
        self.assertEqual(None >> unknown, unknown)
        self.assertEqual(true << true, 2)
        self.assertEqual(true << false, 1)
        self.assertEqual(false << true, 0)
        self.assertEqual(false << false, 0)
        self.assertEqual(true << unknown, unknown)
        self.assertEqual(false << unknown, unknown)
        self.assertEqual(unknown << true, unknown)
        self.assertEqual(unknown << false, unknown)
        self.assertEqual(unknown << unknown, unknown)
        self.assertEqual(true << True, 2)
        self.assertEqual(true << False, 1)
        self.assertEqual(false << True, 0)
        self.assertEqual(false << False, 0)
        self.assertEqual(true << None, unknown)
        self.assertEqual(false << None, unknown)
        self.assertEqual(unknown << True, unknown)
        self.assertEqual(unknown << False, unknown)
        self.assertEqual(unknown << None, unknown)
        self.assertEqual(True << true, 2)
        self.assertEqual(True << false, 1)
        self.assertEqual(False << true, 0)
        self.assertEqual(False << false, 0)
        self.assertEqual(True << unknown, unknown)
        self.assertEqual(False << unknown, unknown)
        self.assertEqual(None << true, unknown)
        self.assertEqual(None << false, unknown)
        self.assertEqual(None << unknown, unknown)
        t = true
        t >>= true
        self.assertEqual(t, 0)
        t = true
        t >>= false
        self.assertEqual(t, 1)
        f = false
        f >>= true
        self.assertEqual(f, 0)
        f = false
        f >>= false
        self.assertEqual(f, 0)
        t = true
        t >>= unknown
        self.assertEqual(t, unknown)
        f = false
        f >>= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u >>= true
        self.assertEqual(u, unknown)
        u = unknown
        u >>= false
        self.assertEqual(u, unknown)
        u = unknown
        u >>= unknown
        self.assertEqual(u, unknown)
        t = true
        t >>= True
        self.assertEqual(t, 0)
        t = true
        t >>= False
        self.assertEqual(t, 1)
        f = false
        f >>= True
        self.assertEqual(f, 0)
        f = false
        f >>= False
        self.assertEqual(f, 0)
        t = true
        t >>= None
        self.assertEqual(t, unknown)
        f = false
        f >>= None
        self.assertEqual(f, unknown)
        u = unknown
        u >>= True
        self.assertEqual(u, unknown)
        u = unknown
        u >>= False
        self.assertEqual(u, unknown)
        u = unknown
        u >>= None
        self.assertEqual(u, unknown)
        t = True
        t >>= true
        self.assertEqual(t, 0)
        t = True
        t >>= false
        self.assertEqual(t, 1)
        f = False
        f >>= true
        self.assertEqual(f, 0)
        f = False
        f >>= false
        self.assertEqual(f, 0)
        t = True
        t >>= unknown
        self.assertEqual(t, unknown)
        f = False
        f >>= unknown
        self.assertEqual(f, unknown)
        u = None
        u >>= true
        self.assertEqual(u, unknown)
        u = None
        u >>= false
        self.assertEqual(u, unknown)
        u = None
        u >>= unknown
        self.assertEqual(u, unknown)
        t = true
        t <<= true
        self.assertEqual(t, 2)
        t = true
        t <<= false
        self.assertEqual(t, 1)
        f = false
        f <<= true
        self.assertEqual(f, 0)
        f = false
        f <<= false
        self.assertEqual(f, 0)
        t = true
        t <<= unknown
        self.assertEqual(t, unknown)
        f = false
        f <<= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u <<= true
        self.assertEqual(u, unknown)
        u = unknown
        u <<= false
        self.assertEqual(u, unknown)
        u = unknown
        u <<= unknown
        self.assertEqual(u, unknown)
        t = true
        t <<= True
        self.assertEqual(t, 2)
        t = true
        t <<= False
        self.assertEqual(t, 1)
        f = false
        f <<= True
        self.assertEqual(f, 0)
        f = false
        f <<= False
        self.assertEqual(f, 0)
        t = true
        t <<= None
        self.assertEqual(t, unknown)
        f = false
        f <<= None
        self.assertEqual(f, unknown)
        u = unknown
        u <<= True
        self.assertEqual(u, unknown)
        u = unknown
        u <<= False
        self.assertEqual(u, unknown)
        u = unknown
        u <<= None
        self.assertEqual(u, unknown)
        t = True
        t <<= true
        self.assertEqual(t, 2)
        t = True
        t <<= false
        self.assertEqual(t, 1)
        f = False
        f <<= true
        self.assertEqual(f, 0)
        f = False
        f <<= false
        self.assertEqual(f, 0)
        t = True
        t <<= unknown
        self.assertEqual(t, unknown)
        f = False
        f <<= unknown
        self.assertEqual(f, unknown)
        u = None
        u <<= true
        self.assertEqual(u, unknown)
        u = None
        u <<= false
        self.assertEqual(u, unknown)
        u = None
        u <<= unknown
        self.assertEqual(u, unknown)

    def test_pow(self):
        "**"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(true ** true, 1)
        self.assertEqual(true ** false, 1)
        self.assertEqual(false ** true, 0)
        self.assertEqual(false ** false, 1)
        self.assertEqual(true ** unknown, unknown)
        self.assertEqual(false ** unknown, unknown)
        self.assertEqual(unknown ** true, unknown)
        self.assertEqual(unknown ** false, 1)
        self.assertEqual(unknown ** unknown, unknown)
        self.assertEqual(true ** True, 1)
        self.assertEqual(true ** False, 1)
        self.assertEqual(false ** True, 0)
        self.assertEqual(false ** False, 1)
        self.assertEqual(true ** None, unknown)
        self.assertEqual(false ** None, unknown)
        self.assertEqual(unknown ** True, unknown)
        self.assertEqual(unknown ** False, 1)
        self.assertEqual(unknown ** None, unknown)
        self.assertEqual(True ** true, 1)
        self.assertEqual(True ** false, 1)
        self.assertEqual(False ** true, 0)
        self.assertEqual(False ** false, 1)
        self.assertEqual(True ** unknown, unknown)
        self.assertEqual(False ** unknown, unknown)
        self.assertEqual(None ** true, unknown)
        self.assertEqual(None ** false, 1)
        self.assertEqual(None ** unknown, unknown)
        t = true
        t **= true
        self.assertEqual(t, 1)
        t = true
        t **= false
        self.assertEqual(t, 1)
        f = false
        f **= true
        self.assertEqual(f, 0)
        f = false
        f **= false
        self.assertEqual(f, 1)
        t = true
        t **= unknown
        self.assertEqual(t, unknown)
        f = false
        f **= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u **= true
        self.assertEqual(u, unknown)
        u = unknown
        u **= false
        self.assertEqual(u, 1)
        u = unknown
        u **= unknown
        self.assertEqual(u, unknown)
        t = true
        t **= True
        self.assertEqual(t, 1)
        t = true
        t **= False
        self.assertEqual(t, 1)
        f = false
        f **= True
        self.assertEqual(f, 0)
        f = false
        f **= False
        self.assertEqual(f, 1)
        t = true
        t **= None
        self.assertEqual(t, unknown)
        f = false
        f **= None
        self.assertEqual(f, unknown)
        u = unknown
        u **= True
        self.assertEqual(u, unknown)
        u = unknown
        u **= False
        self.assertEqual(u, 1)
        u = unknown
        u **= None
        self.assertEqual(u, unknown)
        t = True
        t **= true
        self.assertEqual(t, 1)
        t = True
        t **= false
        self.assertEqual(t, 1)
        f = False
        f **= true
        self.assertEqual(f, 0)
        f = False
        f **= false
        self.assertEqual(f, 1)
        t = True
        t **= unknown
        self.assertEqual(t, unknown)
        f = False
        f **= unknown
        self.assertEqual(f, unknown)
        u = None
        u **= true
        self.assertEqual(u, unknown)
        u = None
        u **= false
        self.assertEqual(u, 1)
        u = None
        u **= unknown
        self.assertEqual(u, unknown)

    def test_mod(self):
        "%"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(true % true, 0)
        self.assertEqual(true % false, unknown)
        self.assertEqual(false % true, 0)
        self.assertEqual(false % false, unknown)
        self.assertEqual(true % unknown, unknown)
        self.assertEqual(false % unknown, unknown)
        self.assertEqual(unknown % true, unknown)
        self.assertEqual(unknown % false, unknown)
        self.assertEqual(unknown % unknown, unknown)
        self.assertEqual(true % True, 0)
        self.assertEqual(true % False, unknown)
        self.assertEqual(false % True, 0)
        self.assertEqual(false % False, unknown)
        self.assertEqual(true % None, unknown)
        self.assertEqual(false % None, unknown)
        self.assertEqual(unknown % True, unknown)
        self.assertEqual(unknown % False, unknown)
        self.assertEqual(unknown % None, unknown)
        self.assertEqual(True % true, 0)
        self.assertEqual(True % false, unknown)
        self.assertEqual(False % true, 0)
        self.assertEqual(False % false, unknown)
        self.assertEqual(True % unknown, unknown)
        self.assertEqual(False % unknown, unknown)
        self.assertEqual(None % true, unknown)
        self.assertEqual(None % false, unknown)
        self.assertEqual(None % unknown, unknown)
        t = true
        t %= true
        self.assertEqual(t, 0)
        t = true
        t %= false
        self.assertEqual(t, unknown)
        f = false
        f %= true
        self.assertEqual(f, 0)
        f = false
        f %= false
        self.assertEqual(f, unknown)
        t = true
        t %= unknown
        self.assertEqual(t, unknown)
        f = false
        f %= unknown
        self.assertEqual(f, unknown)
        u = unknown
        u %= true
        self.assertEqual(u, unknown)
        u = unknown
        u %= false
        self.assertEqual(u, unknown)
        u = unknown
        u %= unknown
        self.assertEqual(u, unknown)
        t = true
        t %= True
        self.assertEqual(t, 0)
        t = true
        t %= False
        self.assertEqual(t, unknown)
        f = false
        f %= True
        self.assertEqual(f, 0)
        f = false
        f %= False
        self.assertEqual(f, unknown)
        t = true
        t %= None
        self.assertEqual(t, unknown)
        f = false
        f %= None
        self.assertEqual(f, unknown)
        u = unknown
        u %= True
        self.assertEqual(u, unknown)
        u = unknown
        u %= False
        self.assertEqual(u, unknown)
        u = unknown
        u %= None
        self.assertEqual(u, unknown)
        t = True
        t %= true
        self.assertEqual(t, 0)
        t = True
        t %= false
        self.assertEqual(t, unknown)
        f = False
        f %= true
        self.assertEqual(f, 0)
        f = False
        f %= false
        self.assertEqual(f, unknown)
        t = True
        t %= unknown
        self.assertEqual(t, unknown)
        f = False
        f %= unknown
        self.assertEqual(f, unknown)
        u = None
        u %= true
        self.assertEqual(u, unknown)
        u = None
        u %= false
        self.assertEqual(u, unknown)
        u = None
        u %= unknown
        self.assertEqual(u, unknown)

    def test_divmod(self):
        "divmod()"
        true = Logical(True)
        false = Logical(False)
        unknown = Logical(None)
        self.assertEqual(divmod(true, true), (1, 0))
        self.assertEqual(divmod(true, false), (unknown, unknown))
        self.assertEqual(divmod(false, true), (0, 0))
        self.assertEqual(divmod(false, false), (unknown, unknown))
        self.assertEqual(divmod(true, unknown), (unknown, unknown))
        self.assertEqual(divmod(false, unknown), (unknown, unknown))
        self.assertEqual(divmod(unknown, true), (unknown, unknown))
        self.assertEqual(divmod(unknown, false), (unknown, unknown))
        self.assertEqual(divmod(unknown, unknown), (unknown, unknown))
        self.assertEqual(divmod(true, True), (1, 0))
        self.assertEqual(divmod(true, False), (unknown, unknown))
        self.assertEqual(divmod(false, True), (0, 0))
        self.assertEqual(divmod(false, False), (unknown, unknown))
        self.assertEqual(divmod(true, None), (unknown, unknown))
        self.assertEqual(divmod(false, None), (unknown, unknown))
        self.assertEqual(divmod(unknown, True), (unknown, unknown))
        self.assertEqual(divmod(unknown, False), (unknown, unknown))
        self.assertEqual(divmod(unknown, None), (unknown, unknown))
        self.assertEqual(divmod(True, true), (1, 0))
        self.assertEqual(divmod(True, false), (unknown, unknown))
        self.assertEqual(divmod(False, true), (0, 0))
        self.assertEqual(divmod(False, false), (unknown, unknown))
        self.assertEqual(divmod(True, unknown), (unknown, unknown))
        self.assertEqual(divmod(False, unknown), (unknown, unknown))
        self.assertEqual(divmod(None, true), (unknown, unknown))
        self.assertEqual(divmod(None, false), (unknown, unknown))
        self.assertEqual(divmod(None, unknown), (unknown, unknown))


class TestQuantum(TestCase):
    "Testing Quantum"

    def test_exceptions(self):
        "errors"
        self.assertRaises(ValueError, Quantum, 'wrong')
        self.assertRaises(TypeError, lambda : (0, 1, 2)[On])
        self.assertRaises(TypeError, lambda : (0, 1, 2)[Off])
        self.assertRaises(TypeError, lambda : (0, 1, 2)[Other])

    def test_other(self):
        "Other"
        huh = unknown = Quantum('')
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum('?')
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum(' ')
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum(None)
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum(Null())
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum(Other)
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)
        huh = Quantum(Unknown)
        self.assertEqual(huh is dbf.Other, True)
        self.assertEqual((huh != huh) is unknown, True)
        self.assertEqual((huh != True) is unknown, True)
        self.assertEqual((huh != False) is unknown, True)

    def test_true(self):
        "true"
        huh = Quantum('True')
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('yes')
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('t')
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('Y')
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum(7)
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum(['blah'])
        unknown = Quantum('?')
        self.assertEqual(huh, True)
        self.assertNotEqual(huh, False)
        self.assertEqual((huh != None) is unknown, True)

    def test_false(self):
        "false"
        huh = Quantum('false')
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('No')
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('F')
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum('n')
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum(0)
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)
        huh = Quantum([])
        unknown = Quantum('?')
        self.assertEqual(huh, False)
        self.assertNotEqual(huh, True)
        self.assertEqual((huh != None) is unknown, True)

    def test_singletons(self):
        "singletons"
        heh = Quantum(True)
        hah = Quantum('Yes')
        ick = Quantum(False)
        ack = Quantum([])
        unk = Quantum('?')
        bla = Quantum(None)
        self.assertEqual(heh is hah, True)
        self.assertEqual(ick is ack, True)
        self.assertEqual(unk is bla, True)

    def test_or(self):
        "or"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        self.assertEqual(true + true, true)
        self.assertEqual(true + false, true)
        self.assertEqual(false + true, true)
        self.assertEqual(false + false, false)
        self.assertEqual(true + unknown, true)
        self.assertEqual(false + unknown is unknown, True)
        self.assertEqual(unknown + unknown is unknown, True)
        self.assertEqual(true | true, true)
        self.assertEqual(true | false, true)
        self.assertEqual(false | true, true)
        self.assertEqual(false | false, false)
        self.assertEqual(true | unknown, true)
        self.assertEqual(false | unknown is unknown, True)
        self.assertEqual(unknown | unknown is unknown, True)
        self.assertEqual(true + True, true)
        self.assertEqual(true + False, true)
        self.assertEqual(false + True, true)
        self.assertEqual(false + False, false)
        self.assertEqual(true + None, true)
        self.assertEqual(false + None is unknown, True)
        self.assertEqual(unknown + None is unknown, True)
        self.assertEqual(true | True, true)
        self.assertEqual(true | False, true)
        self.assertEqual(false | True, true)
        self.assertEqual(false | False, false)
        self.assertEqual(true | None, true)
        self.assertEqual(false | None is unknown, True)
        self.assertEqual(unknown | None is unknown, True)
        self.assertEqual(True + true, true)
        self.assertEqual(True + false, true)
        self.assertEqual(False + true, true)
        self.assertEqual(False + false, false)
        self.assertEqual(True + unknown, true)
        self.assertEqual(False + unknown is unknown, True)
        self.assertEqual(None + unknown is unknown, True)
        self.assertEqual(True | true, true)
        self.assertEqual(True | false, true)
        self.assertEqual(False | true, true)
        self.assertEqual(False | false, false)
        self.assertEqual(True | unknown, true)
        self.assertEqual(False | unknown is unknown, True)
        self.assertEqual(None | unknown is unknown, True)

    def test_and(self):
        "and"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        self.assertEqual(true * true, true)
        self.assertEqual(true * false, false)
        self.assertEqual(false * true, false)
        self.assertEqual(false * false, false)
        self.assertEqual(true * unknown is unknown, True)
        self.assertEqual(false * unknown, false)
        self.assertEqual(unknown * unknown is unknown, True)
        self.assertEqual(true & true, true)
        self.assertEqual(true & false, false)
        self.assertEqual(false & true, false)
        self.assertEqual(false & false, false)
        self.assertEqual(true & unknown is unknown, True)
        self.assertEqual(false & unknown, false)
        self.assertEqual(unknown & unknown is unknown, True)
        self.assertEqual(true * True, true)
        self.assertEqual(true * False, false)
        self.assertEqual(false * True, false)
        self.assertEqual(false * False, false)
        self.assertEqual(true * None is unknown, True)
        self.assertEqual(false * None, false)
        self.assertEqual(unknown * None is unknown, True)
        self.assertEqual(true & True, true)
        self.assertEqual(true & False, false)
        self.assertEqual(false & True, false)
        self.assertEqual(false & False, false)
        self.assertEqual(true & None is unknown, True)
        self.assertEqual(false & None, false)
        self.assertEqual(unknown & None is unknown, True)
        self.assertEqual(True * true, true)
        self.assertEqual(True * false, false)
        self.assertEqual(False * true, false)
        self.assertEqual(False * false, false)
        self.assertEqual(True * unknown is unknown, True)
        self.assertEqual(False * unknown, false)
        self.assertEqual(None * unknown is unknown, True)
        self.assertEqual(True & true, true)
        self.assertEqual(True & false, false)
        self.assertEqual(False & true, false)
        self.assertEqual(False & false, false)
        self.assertEqual(True & unknown is unknown, True)
        self.assertEqual(False & unknown, false)
        self.assertEqual(None & unknown is unknown, True)

    def test_xor(self):
        "xor"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        self.assertEqual(true ^ true, false)
        self.assertEqual(true ^ false, true)
        self.assertEqual(false ^ true, true)
        self.assertEqual(false ^ false, false)
        self.assertEqual(true ^ unknown is unknown, True)
        self.assertEqual(false ^ unknown is unknown, True)
        self.assertEqual(unknown ^ unknown is unknown, True)
        self.assertEqual(true ^ True, false)
        self.assertEqual(true ^ False, true)
        self.assertEqual(false ^ True, true)
        self.assertEqual(false ^ False, false)
        self.assertEqual(true ^ None is unknown, True)
        self.assertEqual(false ^ None is unknown, True)
        self.assertEqual(unknown ^ None is unknown, True)
        self.assertEqual(True ^ true, false)
        self.assertEqual(True ^ false, true)
        self.assertEqual(False ^ true, true)
        self.assertEqual(False ^ false, false)
        self.assertEqual(True ^ unknown is unknown, True)
        self.assertEqual(False ^ unknown is unknown, True)
        self.assertEqual(None ^ unknown is unknown, True)

    def test_implication_material(self):
        "implication, material"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        self.assertEqual(true >> true, true)
        self.assertEqual(true >> false, false)
        self.assertEqual(false >> true, true)
        self.assertEqual(false >> false, true)
        self.assertEqual(true >> unknown is unknown, True)
        self.assertEqual(false >> unknown, true)
        self.assertEqual(unknown >> unknown is unknown, True)
        self.assertEqual(true >> True, true)
        self.assertEqual(true >> False, false)
        self.assertEqual(false >> True, true)
        self.assertEqual(false >> False, true)
        self.assertEqual(true >> None is unknown, True)
        self.assertEqual(false >> None, true)
        self.assertEqual(unknown >> None is unknown, True)
        self.assertEqual(True >> true, true)
        self.assertEqual(True >> false, false)
        self.assertEqual(False >> true, true)
        self.assertEqual(False >> false, true)
        self.assertEqual(True >> unknown is unknown, True)
        self.assertEqual(False >> unknown, true)
        self.assertEqual(None >> unknown is unknown, True)

    def test_implication_relevant(self):
        "implication, relevant"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        Quantum.set_implication('relevant')
        self.assertEqual(true >> true, true)
        self.assertEqual(true >> false, false)
        self.assertEqual(false >> true is unknown, True)
        self.assertEqual(false >> false is unknown, True)
        self.assertEqual(true >> unknown is unknown, True)
        self.assertEqual(false >> unknown is unknown, True)
        self.assertEqual(unknown >> unknown is unknown, True)
        self.assertEqual(true >> True, true)
        self.assertEqual(true >> False, false)
        self.assertEqual(false >> True is unknown, True)
        self.assertEqual(false >> False is unknown, True)
        self.assertEqual(true >> None is unknown, True)
        self.assertEqual(false >> None is unknown, True)
        self.assertEqual(unknown >> None is unknown, True)
        self.assertEqual(True >> true, true)
        self.assertEqual(True >> false, false)
        self.assertEqual(False >> true is unknown, True)
        self.assertEqual(False >> false is unknown, True)
        self.assertEqual(True >> unknown is unknown, True)
        self.assertEqual(False >> unknown is unknown, True)
        self.assertEqual(None >> unknown is unknown, True)

    def test_nand(self):
        "negative and"
        true = Quantum(True)
        false = Quantum(False)
        unknown = Quantum(None)
        self.assertEqual(true.D(true), false)
        self.assertEqual(true.D(false), true)
        self.assertEqual(false.D(true), true)
        self.assertEqual(false.D(false), true)
        self.assertEqual(true.D(unknown) is unknown, True)
        self.assertEqual(false.D(unknown), true)
        self.assertEqual(unknown.D(unknown) is unknown, True)
        self.assertEqual(true.D(True), false)
        self.assertEqual(true.D(False), true)
        self.assertEqual(false.D(True), true)
        self.assertEqual(false.D(False), true)
        self.assertEqual(true.D(None) is unknown, True)
        self.assertEqual(false.D(None), true)
        self.assertEqual(unknown.D(None) is unknown, True)

    def test_negation(self):
        "negation"
        true = Quantum(True)
        false = Quantum(False)
        none = Quantum(None)
        self.assertEqual(-true, false)
        self.assertEqual(-false, true)
        self.assertEqual(-none is none, True)


class TestExceptions(TestCase):

    def test_bad_field_specs_on_creation(self):
        self.assertRaises(FieldSpecError, Table, 'blah', 'age N(3,2)', on_disk=False)
        self.assertRaises(FieldSpecError, Table, 'blah', 'name C(300)', on_disk=False)
        self.assertRaises(FieldSpecError, Table, 'blah', 'born L(9)', on_disk=False)
        self.assertRaises(FieldSpecError, Table, 'blah', 'married D(12)', on_disk=False)
        self.assertRaises(FieldSpecError, Table, 'blah', 'desc M(1)', on_disk=False)
        self.assertRaises(FieldSpecError, Table, 'blah', 'desc', on_disk=False)

    def test_too_many_fields_on_creation(self):
        fields = []
        for i in range(255):
            fields.append('a%03d C(10)' % i)
        Table(':test:', ';'.join(fields), on_disk=False)
        fields.append('a255 C(10)')
        self.assertRaises(DbfError, Table, ':test:', ';'.join(fields), on_disk=False)

    def test_adding_too_many_fields(self):
        fields = []
        for i in range(255):
            fields.append('a%03d C(10)' % i)
        table = Table(':test:', ';'.join(fields), on_disk=False)
        table.open(mode=READ_WRITE)
        self.assertRaises(DbfError, table.add_fields, 'a255 C(10)')

    def test_adding_too_many_fields_with_null(self):
        fields = []
        for i in range(254):
            fields.append(u'a%03d C(10) NULL' % i)
        table = Table(':test:', u';'.join(fields), dbf_type='vfp', on_disk=False)
        table.open(mode=READ_WRITE)
        self.assertRaises(DbfError, table.add_fields, u'a255 C(10)')
        fields = []
        for i in range(254):
            fields.append(u'a%03d C(10) NULL' % i)
        table = Table(':test:', u';'.join(fields), dbf_type='vfp', on_disk=False)
        table.open(mode=READ_WRITE)
        self.assertRaises(DbfError, table.add_fields, u'a255 C(10)')

    def test_too_many_records_in_table(self):
        "skipped -- test takes waaaaaaay too long"

    def test_too_many_fields_to_change_to_null(self):
        fields = []
        for i in range(255):
            fields.append('a%03d C(10)' % i)
        table = Table(':test:', ';'.join(fields), on_disk=False)
        table.open(mode=READ_WRITE)
        try:
            self.assertRaises(DbfError, table.allow_nulls, 'a001')
        finally:
            table.close()

    def test_adding_existing_field_to_table(self):
        table = Table(':blah:', 'name C(50)', on_disk=False)
        self.assertRaises(DbfError, table.add_fields, 'name C(10)')

    def test_deleting_non_existing_field_from_table(self):
        table = Table(':bleh:', 'name C(25)', on_disk=False)
        self.assertRaises(DbfError, table.delete_fields, 'age')

    def test_modify_packed_record(self):
        table = Table(':ummm:', 'name C(3); age N(3,0)', on_disk=False)
        table.open(mode=READ_WRITE)
        for person in (('me', 25), ('you', 35), ('her', 29)):
            table.append(person)
        record = table[1]
        dbf.delete(record)
        table.pack()
        self.assertEqual(('you', 35), record)
        self.assertRaises(DbfError, dbf.write, record, **{'age':33})

    def test_read_only(self):
        table = Table(':ahhh:', 'name C(10)', on_disk=False)
        table.open(mode=dbf.READ_ONLY)
        self.assertRaises(DbfError, table.append, dict(name='uh uh!'))

    def test_clipper(self):
        Table(os.path.join(tempdir, 'temptable'), 'name C(377); thesis C(20179)', dbf_type='clp')
        self.assertRaises(BadDataError, Table, os.path.join(tempdir, 'temptable'))

    def test_data_overflow(self):
        table = Table(os.path.join(tempdir, 'temptable'), 'mine C(2); yours C(15)')
        table.open(mode=READ_WRITE)
        table.append(('me',))
        try:
            table.append(('yours',))
        except DataOverflowError:
            pass
        finally:
            table.close()

    def test_change_null_field(self):
        "cannot make an existing field nullable"
        table = Table(
                os.path.join(tempdir, 'vfp_table'),
                'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;' +
                ' weight F(18,3); age I; meeting T; misc G; photo P; price Y;' +
                ' dist B',
                dbf_type='vfp',
                default_data_types='enhanced',
                )
        table.open(mode=READ_WRITE)
        namelist = []
        paidlist = []
        qtylist = []
        orderlist = []
        desclist = []
        for i in range(10):
            name = words[i]
            paid = len(words[i]) % 3 == 0
            qty = floats[i]
            orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) + 1, (numbers[i] % 27) + 1)
            desc = ' '.join(words[i:i+50])
            namelist.append(name)
            paidlist.append(paid)
            qtylist.append(qty)
            orderlist.append(orderdate)
            desclist.append(desc)
            table.append({u'name':name, u'paid':paid, u'qty':qty, u'orderdate':orderdate, u'desc':desc})
        # plus a blank record
        namelist.append('')
        paidlist.append(None)
        qtylist.append(None)
        orderlist.append(None)
        desclist.append('')
        table.append()
        for field in table.field_names:
            self.assertEqual(table.nullable_field(field), False)
        self.assertRaises(DbfError, table.allow_nulls, (u'name, qty'))
        table.close()


class TestWarnings(TestCase):

    def test_field_name_warning(self):
        with warnings.catch_warnings(record=True) as w:
            huh = dbf.Table('cloud', 'p^type C(25)', on_disk=False).open(dbf.READ_WRITE)
            self.assertEqual(len(w), 1, str(w))
            warning = w[-1]
            self.assertTrue(issubclass(warning.category, dbf.FieldNameWarning))
            huh.resize_field('p^type', 30)
            self.assertEqual(len(w), 1, 'warning objects\n'+'\n'.join([str(warning) for warning in w]))
            huh.add_fields('c^word C(50)')
            self.assertEqual(len(w), 2, str(w))
            warning = w[-1]
            self.assertTrue(issubclass(warning.category, dbf.FieldNameWarning))


class TestIndexLocation(TestCase):

    def test_false(self):
        self.assertFalse(IndexLocation(0, False))
        self.assertFalse(IndexLocation(42, False))

    def test_true(self):
        self.assertTrue(IndexLocation(0, True))
        self.assertTrue(IndexLocation(42, True))


class TestDbfCreation(TestCase):
    "Testing table creation..."
    def test_db3_memory_tables(self):
        "dbf tables in memory"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(':memory:', fieldlist, dbf_type='db3', on_disk=False)
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)
                self.assertTrue(all([type(x) is unicode for x in table.field_names]))

    def test_db3_disk_tables(self):
        "dbf table on disk"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(os.path.join(tempdir, 'temptable'), ';'.join(fieldlist), dbf_type='db3')
                table = Table(os.path.join(tempdir, 'temptable'), dbf_type='db3')
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)
                table = open(table.filename, 'rb')
                try:
                    last_byte = ord(table.read()[-1])
                finally:
                    table.close()
                self.assertEqual(last_byte, EOF)

    def test_clp_memory_tables(self):
        "clp tables in memory"
        fields = unicodify(['name C(10977)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(':memory:', fieldlist, dbf_type='clp', on_disk=False)
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)
                self.assertTrue(all([type(x) is unicode for x in table.field_names]))

    def test_clp_disk_tables(self):
        "clp table on disk"
        table = Table(os.path.join(tempdir, 'temptable'), u'name C(377); thesis C(20179)', dbf_type='clp')
        self.assertEqual(table.record_length, 20557)
        fields = unicodify(['name C(10977)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(os.path.join(tempdir, 'temptable'), u';'.join(fieldlist), dbf_type='clp')
                table = Table(os.path.join(tempdir, 'temptable'), dbf_type='clp')
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)
                table = open(table.filename, 'rb')
                try:
                    last_byte = ord(table.read()[-1])
                finally:
                    table.close()
                self.assertEqual(last_byte, EOF)

    def test_fp_memory_tables(self):
        "fp tables in memory"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)',
                'litres F(11,5)', 'blob G', 'graphic P', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(':memory:', u';'.join(fieldlist), dbf_type='fp', on_disk=False)
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)

    def test_fp_disk_tables(self):
        "fp tables on disk"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)',
                'litres F(11,5)', 'blob G', 'graphic P', 'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(os.path.join(tempdir, 'tempfp'), u';'.join(fieldlist), dbf_type='fp')
                table = Table(os.path.join(tempdir, 'tempfp'), dbf_type='fp')
                actualFields = table.structure()
                self.assertEqual(fieldlist, actualFields)

    def test_vfp_memory_tables(self):
        "vfp tables in memory"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)',
                'mass B', 'litres F(11,5)', 'int I', 'birth T', 'blob G', 'graphic P',
                'menu C(50) BINARY', 'graduated L NULL', 'fired D NULL', 'cipher C(50) NOCPTRANS NULL',
                'weight F(7,3)'])
        for i in range(1, len(fields)+1):
            for fieldlist in combinate(fields, i):
                table = Table(':memory:', u';'.join(fieldlist), dbf_type='vfp', on_disk=False)
                actualFields = table.structure()
                fieldlist = [f.replace('NOCPTRANS','BINARY') for f in fieldlist]
                self.assertEqual(fieldlist, actualFields)

    def test_vfp_disk_tables(self):
        "vfp tables on disk"
        fields = unicodify(['name C(25)', 'hiredate D', 'male L', 'wisdom M', 'qty N(3,0)',
                'mass B', 'litres F(11,5)', 'int I', 'birth T', 'blob G', 'graphic P',
                'menu C(50) binary', 'graduated L null', 'fired
D NULL', 'cipher C(50) nocptrans NULL', 'weight F(7,3)']) for i in range(1, len(fields)+1): for fieldlist in combinate(fields, i): table = Table(os.path.join(tempdir, 'tempvfp'), u';'.join(fieldlist), dbf_type='vfp') table = Table(os.path.join(tempdir, 'tempvfp'), dbf_type='vfp') actualFields = table.structure() fieldlist = [f.replace('nocptrans','BINARY') for f in fieldlist] self.assertEqual(fieldlist, actualFields) def test_codepage(self): table = Table(os.path.join(tempdir, 'tempvfp'), u'name C(25); male L; fired D NULL', dbf_type='vfp') table.close() self.assertEqual(dbf.default_codepage, 'ascii') self.assertEqual(table.codepage, dbf.CodePage('ascii')) table.close() table.open(mode=READ_WRITE) table.close() table = Table(os.path.join(tempdir, 'tempvfp'), u'name C(25); male L; fired D NULL', dbf_type='vfp', codepage='cp850') table.close() self.assertEqual(table.codepage, dbf.CodePage('cp850')) newtable = table.new('tempvfp2', codepage='cp437') self.assertEqual(newtable.codepage, dbf.CodePage('cp437')) newtable.open(mode=READ_WRITE) newtable.create_backup() newtable.close() bckup = Table(os.path.join(tempdir, newtable.backup)) self.assertEqual(bckup.codepage, newtable.codepage) def test_db3_ignore_memos(self): table = Table(os.path.join(tempdir, 'tempdb3'), u'name C(25); wisdom M', dbf_type='db3').open(mode=READ_WRITE) table.append(('QC Tester', 'check it twice! check it thrice! check it . . . uh . . . again!')) table.close() table = Table(os.path.join(tempdir, 'tempdb3'), dbf_type='db3', ignore_memos=True) table.open(mode=READ_WRITE) try: self.assertEqual(table[0].wisdom, u'') finally: table.close() def test_fp_ignore_memos(self): table = Table(os.path.join(tempdir, 'tempdb3'), u'name C(25); wisdom M', dbf_type='fp').open(mode=READ_WRITE) table.append(('QC Tester', 'check it twice! check it thrice! check it . . . uh . . . 
again!')) table.close() table = Table(os.path.join(tempdir, 'tempdb3'), dbf_type='fp', ignore_memos=True) table.open(mode=READ_WRITE) try: self.assertEqual(table[0].wisdom, u'') finally: table.close() def test_vfp_ignore_memos(self): table = Table(os.path.join(tempdir, 'tempdb3'), u'name C(25); wisdom M', dbf_type='vfp').open(mode=READ_WRITE) table.append(('QC Tester', 'check it twice! check it thrice! check it . . . uh . . . again!')) table.close() table = Table(os.path.join(tempdir, 'tempdb3'), dbf_type='vfp', ignore_memos=True) table.open(mode=READ_WRITE) try: self.assertEqual(table[0].wisdom, u'') finally: table.close() def test_clp_ignore_memos(self): table = Table(os.path.join(tempdir, 'tempdb3'), u'name C(25); wisdom M', dbf_type='clp').open(mode=READ_WRITE) table.append(('QC Tester', 'check it twice! check it thrice! check it . . . uh . . . again!')) table.close() table = Table(os.path.join(tempdir, 'tempdb3'), dbf_type='clp', ignore_memos=True) table.open(mode=READ_WRITE) try: self.assertEqual(table[0].wisdom, u'') finally: table.close() class TestDbfRecords(TestCase): "Testing records" def setUp(self): self.dbf_table = Table( os.path.join(tempdir, 'dbf_table'), u'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3', ) self.vfp_table = Table( os.path.join(tempdir, 'vfp_table'), u'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;' + u' weight F(18,3); age I; meeting T; misc G; photo P; price Y;' + u' dist B', dbf_type='vfp', default_data_types='enhanced', ) self.null_vfp_table = null_table = Table( os.path.join(tempdir, 'null_vfp_table'), 'first C(25) null; last C(25); height N(3,1) null; age N(3,0); life_story M null; plans M', dbf_type='vfp', ) null_table.open(dbf.READ_WRITE) null_table.append() null_table.close() def tearDown(self): self.dbf_table.close() self.vfp_table.close() self.null_vfp_table.close() def test_slicing(self): table = self.dbf_table table.open(mode=READ_WRITE) table.append(('myself', True, 5.97, 
dbf.Date(2012, 5, 21), 'really cool')) self.assertEqual(table.first_record[u'name':u'qty'], table[0][:3]) def test_dbf_adding_records(self): "dbf table: adding records" table = self.dbf_table table.open(mode=READ_WRITE) namelist = [] paidlist = [] qtylist = [] orderlist = [] desclist = [] for i in range(len(floats)): name = words[i] paid = len(words[i]) % 3 == 0 qty = floats[i] orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) namelist.append(name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) table.append(unicodify({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc})) record = table[-1] t = open(table.filename, 'rb') last_byte = ord(t.read()[-1]) t.close() self.assertEqual(last_byte, EOF) self.assertEqual(record.name.strip(), name) self.assertEqual(record.paid, paid) self.assertEqual(record.qty, round(qty, 5)) self.assertEqual(record.orderdate, orderdate) self.assertEqual(record.desc.strip(), desc) # plus a blank record namelist.append('') paidlist.append(None) qtylist.append(None) orderlist.append(None) desclist.append('') blank_record = table.append() self.assertEqual(len(table), len(floats)+1) for field in table.field_names: self.assertEqual(1, table.field_names.count(field)) table.close() t = open(table.filename, 'rb') last_byte = ord(t.read()[-1]) t.close() self.assertEqual(last_byte, EOF) table = Table(table.filename, dbf_type='db3') table.open(mode=READ_WRITE) self.assertEqual(len(table), len(floats)+1) for field in table.field_names: self.assertEqual(1, table.field_names.count(field)) i = 0 for record in table[:-1]: i += 1 continue self.assertEqual(dbf.recno(record), i) self.assertEqual(table[i].name.strip(), namelist[i]) self.assertEqual(record.name.strip(), namelist[i]) self.assertEqual(table[i].paid, paidlist[i]) self.assertEqual(record.paid, paidlist[i]) self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, 
True) self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True) self.assertEqual(table[i].orderdate, orderlist[i]) self.assertEqual(record.orderdate, orderlist[i]) self.assertEqual(table[i].desc.strip(), desclist[i]) self.assertEqual(record.desc.strip(), desclist[i]) i += 1 record = table[-1] self.assertEqual(dbf.recno(record), i) self.assertEqual(table[i].name.strip(), namelist[i]) self.assertEqual(record.name.strip(), namelist[i]) self.assertEqual(table[i].paid, paidlist[i]) self.assertEqual(record.paid, paidlist[i]) self.assertEqual(table[i].qty, qtylist[i]) self.assertEqual(record.qty, qtylist[i]) self.assertEqual(table[i].orderdate, orderlist[i]) self.assertEqual(record.orderdate, orderlist[i]) self.assertEqual(table[i].desc, desclist[i]) self.assertEqual(record.desc, desclist[i]) i += 1 self.assertEqual(i, len(table)) table.close() def test_vfp_adding_records(self): "vfp table: adding records" table = self.vfp_table table.open(mode=READ_WRITE) namelist = [] paidlist = [] qtylist = [] orderlist = [] desclist = [] masslist = [] weightlist = [] agelist = [] meetlist = [] misclist = [] photolist = [] pricelist = [] distlist = [] for i in range(len(floats)): name = words[i] paid = len(words[i]) % 3 == 0 qty = floats[i] orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) mass = floats[i] * floats[i] / 2.0 weight = floats[i] * 3 dist = floats[i] * 2 age = numbers[i] meeting = datetime.datetime((numbers[i] + 2000), (numbers[i] % 12)+1, (numbers[i] % 28)+1, (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60) misc = (' '.join(words[i:i+50:3])).encode('ascii') photo = (' '.join(words[i:i+50:7])).encode('ascii') price = Decimal(round(floats[i] * 2.182737, 4)) namelist.append(name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) masslist.append(mass) distlist.append(dist) weightlist.append(weight) agelist.append(age) meetlist.append(meeting) 
misclist.append(misc) photolist.append(photo) pricelist.append(price) table.append(unicodify({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc, 'mass':mass, 'weight':weight, 'age':age, 'meeting':meeting, 'misc':misc, 'photo':photo, 'dist': dist, 'price':price})) record = table[-1] self.assertEqual(record.name.strip(), name) self.assertEqual(record.paid, paid) self.assertEqual(round(record.qty, 5), round(qty, 5)) self.assertEqual(record.orderdate, orderdate) self.assertEqual(record.desc.strip(), desc) self.assertEqual(record.mass, mass) self.assertEqual(record.dist, dist) self.assertEqual(round(record.weight, 3), round(weight, 3)) self.assertEqual(record.age, age) self.assertEqual(record.meeting, meeting) self.assertEqual(record.misc, misc) self.assertEqual(record.photo, photo) self.assertEqual(round(record.price, 4), round(price, 4)) # plus a blank record namelist.append('') paidlist.append(Unknown) qtylist.append(None) orderlist.append(NullDate) desclist.append('') masslist.append(0.0) distlist.append(0.0) weightlist.append(None) agelist.append(0) meetlist.append(NullDateTime) misclist.append(''.encode('ascii')) photolist.append(''.encode('ascii')) pricelist.append(Decimal('0.0')) table.append() table.close() table = Table(table.filename, dbf_type='vfp') table.open(mode=READ_WRITE) self.assertEqual(len(table), len(floats)+1) i = 0 for record in table[:-1]: self.assertEqual(dbf.recno(record), i) self.assertEqual(table[i].name.strip(), namelist[i]) self.assertEqual(record.name.strip(), namelist[i]) self.assertEqual(table[i].paid, paidlist[i]) self.assertEqual(record.paid, paidlist[i]) self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True) self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True) self.assertEqual(table[i].orderdate, orderlist[i]) self.assertEqual(record.orderdate, orderlist[i]) self.assertEqual(table[i].desc.strip(), desclist[i]) self.assertEqual(record.desc.strip(), desclist[i]) self.assertEqual(record.mass, 
masslist[i]) self.assertEqual(record.dist, distlist[i]) self.assertEqual(table[i].mass, masslist[i]) self.assertEqual(record.weight, round(weightlist[i], 3)) self.assertEqual(table[i].weight, round(weightlist[i], 3)) self.assertEqual(record.age, agelist[i]) self.assertEqual(table[i].age, agelist[i]) self.assertEqual(record.meeting, meetlist[i]) self.assertEqual(table[i].meeting, meetlist[i]) self.assertEqual(record.misc, misclist[i]) self.assertEqual(table[i].misc, misclist[i]) self.assertEqual(record.photo, photolist[i]) self.assertEqual(table[i].photo, photolist[i]) self.assertEqual(round(record.price, 4), round(pricelist[i], 4)) self.assertEqual(round(table[i].price, 4), round(pricelist[i], 4)) i += 1 record = table[-1] self.assertEqual(dbf.recno(record), i) self.assertEqual(table[i].name.strip(), namelist[i]) self.assertEqual(record.name.strip(), namelist[i]) self.assertEqual(table[i].paid is None, True) self.assertEqual(record.paid is None, True) self.assertEqual(table[i].qty, None) self.assertEqual(record.qty, None) self.assertEqual(table[i].orderdate, orderlist[i]) self.assertEqual(record.orderdate, orderlist[i]) self.assertEqual(table[i].desc, desclist[i]) self.assertEqual(record.desc, desclist[i]) self.assertEqual(record.mass, masslist[i]) self.assertEqual(table[i].mass, masslist[i]) self.assertEqual(record.dist, distlist[i]) self.assertEqual(table[i].dist, distlist[i]) self.assertEqual(record.weight, weightlist[i]) self.assertEqual(table[i].weight, weightlist[i]) self.assertEqual(record.age, agelist[i]) self.assertEqual(table[i].age, agelist[i]) self.assertEqual(record.meeting, meetlist[i]) self.assertEqual(table[i].meeting, meetlist[i]) self.assertEqual(record.misc, misclist[i]) self.assertEqual(table[i].misc, misclist[i]) self.assertEqual(record.photo, photolist[i]) self.assertEqual(table[i].photo, photolist[i]) self.assertEqual(record.price, 0) self.assertEqual(table[i].price, 0) i += 1 table.close() def test_char_memo_return_type(self): "check 
character fields return type" table = Table(':memory:', 'text C(50); memo M', codepage='cp1252', dbf_type='vfp', on_disk=False) table.open(mode=READ_WRITE) table.append(('another one bites the dust', "and another one's gone, and another one's gone...")) table.append() for record in table: self.assertTrue(type(record.text) is unicode) self.assertTrue(type(record.memo) is unicode) table = Table(':memory:', 'text C(50); memo M', codepage='cp1252', dbf_type='vfp', default_data_types=dict(C=Char, M=Char), on_disk=False) table.open(mode=READ_WRITE) table.append(('another one bites the dust', "and another one's gone, and another one's gone...")) table.append() for record in table: self.assertTrue(type(record.text) is Char) self.assertTrue(type(record.memo) is Char) table = Table(':memory:', 'text C(50); memo M', codepage='cp1252', dbf_type='vfp', default_data_types=dict(C=(Char, NoneType), M=(Char, NoneType)), on_disk=False) table.open(mode=READ_WRITE) table.append(('another one bites the dust', "and another one's gone, and another one's gone...")) table.append() record = table[0] self.assertTrue(type(record.text) is Char) self.assertTrue(type(record.memo) is Char) record = table[1] self.assertTrue(type(record.text) is NoneType) self.assertTrue(type(record.memo) is NoneType) def test_empty_is_none(self): "empty and None values" table = Table(':memory:', 'name C(20); born L; married D; appt T; wisdom M', dbf_type='vfp', on_disk=False) table.open(mode=READ_WRITE) table.append() record = table[-1] self.assertTrue(record.born is None) self.assertTrue(record.married is None) self.assertTrue(record.appt is None) self.assertEqual(record.wisdom, '') appt = DateTime.now() dbf.write( record, born = True, married = Date(1992, 6, 27), appt = appt, wisdom = 'Choose Python', ) self.assertTrue(record.born) self.assertEqual(record.married, Date(1992, 6, 27)) self.assertEqual(record.appt, appt) self.assertEqual(record.wisdom, 'Choose Python') dbf.write( record, born = Unknown, married = 
NullDate, appt = NullDateTime, wisdom = '', ) self.assertTrue(record.born is None) self.assertTrue(record.married is None) self.assertTrue(record.appt is None) self.assertEqual(record.wisdom, '') def test_custom_data_type(self): "custom data types" table = Table( filename=':memory:', field_specs='name C(20); born L; married D; appt T; wisdom M', field_data_types=dict(name=Char, born=Logical, married=Date, appt=DateTime, wisdom=Char,), dbf_type='vfp', on_disk=False, ) table.open(mode=READ_WRITE) table.append() record = table[-1] self.assertTrue(type(record.name) is Char, "record.name is %r, not Char" % type(record.name)) self.assertTrue(type(record.born) is Logical, "record.born is %r, not Logical" % type(record.born)) self.assertTrue(type(record.married) is Date, "record.married is %r, not Date" % type(record.married)) self.assertTrue(type(record.appt) is DateTime, "record.appt is %r, not DateTime" % type(record.appt)) self.assertTrue(type(record.wisdom) is Char, "record.wisdom is %r, not Char" % type(record.wisdom)) self.assertEqual(record.name, ' ' * 20) self.assertTrue(record.born is Unknown, "record.born is %r, not Unknown" % record.born) self.assertTrue(record.married is NullDate, "record.married is %r, not NullDate" % record.married) self.assertEqual(record.married, None) self.assertTrue(record.appt is NullDateTime, "record.appt is %r, not NullDateTime" % record.appt) self.assertEqual(record.appt, None) appt = DateTime.now() dbf.write( record, name = 'Ethan ', born = True, married = Date(1992, 6, 27), appt = appt, wisdom = 'Choose Python', ) self.assertEqual(type(record.name), Char, "record.wisdom is %r, but should be Char" % record.wisdom) self.assertTrue(record.born is Truth) self.assertEqual(record.married, Date(1992, 6, 27)) self.assertEqual(record.appt, appt) self.assertEqual(type(record.wisdom), Char, "record.wisdom is %r, but should be Char" % record.wisdom) self.assertEqual(record.wisdom, 'Choose Python') dbf.write(record, born=Falsth) 
self.assertEqual(record.born, False) dbf.write(record, born=None, married=None, appt=None, wisdom=None) self.assertTrue(record.born is Unknown) self.assertTrue(record.married is NullDate) self.assertTrue(record.appt is NullDateTime) self.assertTrue(type(record.wisdom) is Char, "record.wisdom is %r, but should be Char" % type(record.wisdom)) def test_datatypes_param(self): "field_types with normal data type but None on empty" table = Table( filename=':memory:', field_specs='name C(20); born L; married D; wisdom M', field_data_types=dict(name=(str, NoneType), born=(bool, bool)), dbf_type='db3', on_disk=False, ) table.open(mode=READ_WRITE) table.append() record = table[-1] self.assertTrue(type(record.name) is type(None), "record.name is %r, not None" % type(record.name)) self.assertTrue(type(record.born) is bool, "record.born is %r, not bool" % type(record.born)) self.assertTrue(record.name is None) self.assertTrue(record.born is False, "record.born is %r, not False" % record.born) dbf.write(record, name='Ethan ', born=True) self.assertEqual(type(record.name), str, "record.name is %r, but should be Char" % record.wisdom) self.assertTrue(record.born is True) dbf.write(record, born=False) self.assertEqual(record.born, False) dbf.write( record, name = None, born = None, ) self.assertTrue(record.name is None) self.assertTrue(record.born is False) def test_null_type(self): "NullType" table = Table( filename=':memory:', field_specs='name C(20) NULL; born L NULL; married D NULL; appt T NULL; wisdom M NULL', default_data_types=dict( C=(Char, NoneType, NullType), L=(Logical, NoneType, NullType), D=(Date, NoneType, NullType), T=(DateTime, NoneType, NullType), M=(Char, NoneType, NullType), ), dbf_type='vfp', on_disk=False, ) table.open(mode=READ_WRITE) table.append() record = table[-1] self.assertIs(record.name, Null) self.assertIs(record.born, Null) self.assertIs(record.married, Null) self.assertIs(record.appt, Null) self.assertIs(record.wisdom, Null) appt = 
datetime.datetime(2012, 12, 15, 9, 37, 11) dbf.write( record, name = 'Ethan ', born = True, married = datetime.date(2001, 6, 27), appt = appt, wisdom = 'timing is everything', ) record = table[-1] self.assertEqual(record.name, u'Ethan') self.assertEqual(type(record.name), Char) self.assertTrue(record.born) self.assertTrue(record.born is Truth) self.assertEqual(record.married, datetime.date(2001, 6, 27)) self.assertEqual(type(record.married), Date) self.assertEqual(record.appt, datetime.datetime(2012, 12, 15, 9, 37, 11)) self.assertEqual(type(record.appt), DateTime) self.assertEqual(record.wisdom, u'timing is everything') self.assertEqual(type(record.wisdom), Char) dbf.write(record, name=Null, born=Null, married=Null, appt=Null, wisdom=Null) self.assertTrue(record.name is Null) self.assertTrue(record.born is Null) self.assertTrue(record.married is Null) self.assertTrue(record.appt is Null) self.assertTrue(record.wisdom is Null) dbf.write( record, name = None, born = None, married = None, appt = None, wisdom = None, ) record = table[-1] self.assertTrue(record.name is None) self.assertTrue(record.born is None) self.assertTrue(record.married is None) self.assertTrue(record.appt is None) self.assertTrue(record.wisdom is None) table = Table( filename=':memory:', field_specs='name C(20); born L; married D NULL; appt T; wisdom M; pets L; cars N(3,0) NULL; story M; died D NULL;', default_data_types=dict( C=(Char, NoneType, NullType), L=(Logical, NoneType, NullType), D=(Date, NoneType, NullType), T=(DateTime, NoneType, NullType), M=(Char, NoneType, NullType), N=(int, NoneType, NullType), ), dbf_type='vfp', on_disk=False, ) table.open(mode=READ_WRITE) table.append() record = table[-1] self.assertTrue(record.name is None) self.assertTrue(record.born is None) self.assertTrue(record.married is Null) self.assertTrue(record.appt is None) self.assertTrue(record.wisdom is None) self.assertTrue(record.pets is None) self.assertTrue(record.cars is Null) self.assertTrue(record.story is 
None) self.assertTrue(record.died is Null) dbf.write( record, name = 'Ethan ', born = True, married = datetime.date(2001, 6, 27), appt = appt, wisdom = 'timing is everything', pets = True, cars = 10, story = 'a poor farm boy who made good', died = datetime.date(2018, 5, 30), ) record = table[-1] self.assertEqual(record.name, 'Ethan') self.assertTrue(record.born) self.assertTrue(record.born is Truth) self.assertEqual(record.married, datetime.date(2001, 6, 27)) self.assertEqual(record.appt, datetime.datetime(2012, 12, 15, 9, 37, 11)) self.assertEqual(record.wisdom, 'timing is everything') self.assertTrue(record.pets) self.assertEqual(record.cars, 10) self.assertEqual(record.story, 'a poor farm boy who made good',) self.assertEqual(record.died, datetime.date(2018, 5, 30)) dbf.write(record, married=Null, died=Null) record = table[-1] self.assertTrue(record.married is Null) self.assertTrue(record.died is Null) def test_nonascii_text_cptrans(self): "check non-ascii text to unicode" table = Table(':memory:', 'data C(50); memo M', codepage='cp437', dbf_type='vfp', on_disk=False) table.open(mode=READ_WRITE) decoder = codecs.getdecoder('cp437') if py_ver < (3, 0): high_ascii = decoder(''.join(chr(c) for c in range(128, 128+50)))[0] else: high_ascii = bytes(range(128, 128+50)).decode('cp437') table.append(dict(data=high_ascii, memo=high_ascii)) self.assertEqual(table[0].data, high_ascii) self.assertEqual(table[0].memo, high_ascii) table.close() def test_nonascii_text_no_cptrans(self): "check non-ascii text to bytes" table = Table(':memory:', 'bindata C(50) BINARY; binmemo M BINARY', codepage='cp1252', dbf_type='vfp', on_disk=False) table.open(mode=READ_WRITE) if py_ver < (3, 0): high_ascii = ''.join(chr(c) for c in range(128, 128+50)) else: high_ascii = bytes(range(128, 128+50)) table.append(dict(bindata=high_ascii, binmemo=high_ascii)) bindata = table[0].bindata binmemo = table[0].binmemo self.assertTrue(isinstance(bindata, bytes)) self.assertTrue(isinstance(binmemo, bytes)) 
self.assertEqual(table[0].bindata, high_ascii) self.assertEqual(table[0].binmemo, high_ascii) table.close() def test_add_null_field(self): "adding a NULL field to an existing table" table = Table( self.vfp_table.filename, 'name C(50); age N(3,0)', dbf_type='vfp', ) table.open(mode=READ_WRITE) def _50(text): return text + ' ' * (50 - len(text)) data = ( (_50('Ethan'), 29), (_50('Joseph'), 33), (_50('Michael'), 54), ) for datum in data: table.append(datum) for datum, recordnum in zip(data, table): self.assertEqual(datum, tuple(recordnum)) table.add_fields('fired D NULL') for datum, recordnum in zip(data, table): self.assertEqual(datum, tuple(recordnum)[:2]) data += ((_50('Daniel'), 44, Null), ) table.append(('Daniel', 44, Null)) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2]) table.close() table = Table(table.filename) table.open(mode=READ_WRITE) for datum, recordnum in zip(data, table): self.assertEqual(datum[0:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2]) table.close() def test_remove_null_field(self): "removing NULL fields from an existing table" table = Table( self.vfp_table.filename, 'name C(50); age N(3,0); fired D NULL', dbf_type='vfp', ) table.open(mode=READ_WRITE) def _50(text): return text + ' ' * (50 - len(text)) data = ( (_50('Ethan'), 29, Null), (_50('Joseph'), 33, Null), (_50('Michael'), 54, Date(2010, 5, 3))) for datum in data: table.append(datum) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2] or datum[2] == recordnum[2]) table.delete_fields('fired') for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)) data += ((_50('Daniel'), 44), ) table.append(('Daniel', 44)) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)) table.close() table = Table(table.filename) table.open(mode=READ_WRITE) for 
datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)) table.close() def test_add_field_to_null(self): "adding a normal field to a table with NULL fields" table = Table( self.vfp_table.filename, 'name C(50); age N(3,0); fired D NULL', dbf_type='vfp', ) table.open(mode=READ_WRITE) def _50(text): return text + ' ' * (50 - len(text)) data = ( (_50('Ethan'), 29, Null), (_50('Joseph'), 33, Null), (_50('Michael'), 54, Date(2010, 7, 4)), ) for datum in data: table.append(datum) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2] or datum[2] == recordnum[2]) table.add_fields('tenure N(3,0)') for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2] or datum[2] == recordnum[2]) data += ((_50('Daniel'), 44, Date(2005, 1, 31), 15 ), ) table.append(('Daniel', 44, Date(2005, 1, 31), 15)) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2] or datum[2] == recordnum[2]) self.assertEqual(datum[3], recordnum[3]) table.close() table = Table(table.filename) table.open(mode=READ_WRITE) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is recordnum[2] or datum[2] == recordnum[2]) self.assertEqual(datum[3], recordnum[3]) table.close() def test_remove_field_from_null(self): "removing a normal field from a table with NULL fields" table = Table( self.vfp_table.filename, 'name C(50); age N(3,0); fired D NULL', dbf_type='vfp', ) table.open(mode=READ_WRITE) def _50(text): return text + ' ' * (50 - len(text)) data = ( (_50('Ethan'), 29, Null), (_50('Joseph'), 33, Null), (_50('Michael'), 54, Date(2010, 7, 4)), ) for datum in data: table.append(datum) for datum, recordnum in zip(data, table): self.assertEqual(datum[:2], tuple(recordnum)[:2]) self.assertTrue(datum[2] is 
recordnum[2] or datum[2] == recordnum[2]) table.delete_fields('age') for datum, recordnum in zip(data, table): self.assertEqual(datum[0], recordnum[0]) self.assertTrue(datum[-1] is recordnum[1] or datum[-1] == recordnum[1]) data += ((_50('Daniel'), Date(2001, 11, 13)), ) table.append(('Daniel', Date(2001, 11, 13))) for datum, recordnum in zip(data, table): self.assertEqual(datum[0], recordnum[0]) self.assertTrue(datum[-1] is recordnum[1] or datum[-1] == recordnum[1]) table.close() table = Table(table.filename) table.open(mode=READ_WRITE) for datum, recordnum in zip(data, table): self.assertEqual(datum[0], recordnum[0]) self.assertTrue(datum[-1] is recordnum[-1] or datum[-1] == recordnum[-1], "name = %s; datum[-1] = %r; recordnum[-1] = %r" % (datum[0], datum[-1], recordnum[-1])) table.close() def test_blank_record_template_uses_null(self): nullable = self.null_vfp_table with nullable: rec = nullable[-1] self.assertTrue(rec.first is Null, "rec.first is %r" % (rec.first, )) self.assertTrue(rec.last == ' '*25, "rec.last is %r" % (rec.last, )) self.assertTrue(rec.height is Null, "rec.height is %r" % (rec.height, )) self.assertTrue(rec.age is None, "rec.age is %r" % (rec.age, )) self.assertTrue(rec.life_story is Null, "rec.life_story is %r" % (rec.life_story, )) self.assertTrue(rec.plans == '', "rec.plans is %r" % (rec.plans, )) nullable.close() nullable = Table( self.null_vfp_table.filename, default_data_types='enhanced', ) with nullable: rec = nullable[-1] self.assertTrue(rec.first is Null, "rec.first is %r" % (rec.first, )) self.assertTrue(rec.last == '', "rec.last is %r" % (rec.last, )) self.assertTrue(rec.height is Null, "rec.height is %r" % (rec.height, )) self.assertTrue(rec.age is None, "rec.age is %r" % (rec.age, )) self.assertTrue(rec.life_story is Null, "rec.life_story is %r" % (rec.life_story, )) self.assertTrue(rec.plans == '', "rec.plans is %r" % (rec.plans, )) nullable.close() nullable = Table( self.null_vfp_table.filename, default_data_types=dict( 
C=(Char, NoneType, NullType), L=(Logical, NoneType, NullType), D=(Date, NoneType, NullType), T=(DateTime, NoneType, NullType), M=(Char, NoneType, NullType), ), ) with nullable: rec = nullable[-1] self.assertTrue(rec.first is Null, "rec.first is %r" % (rec.first, )) self.assertTrue(rec.last is None, "rec.last is %r" % (rec.last, )) self.assertTrue(rec.height is Null, "rec.height is %r" % (rec.height, )) self.assertTrue(rec.age is None, "rec.age is %r" % (rec.age, )) self.assertTrue(rec.life_story is Null, "rec.life_story is %r" % (rec.life_story, )) self.assertTrue(rec.plans is None, "rec.plans is %r" % (rec.plans, )) def test_new_record_with_partial_fields_respects_null(self): nullable = self.null_vfp_table nullable.close() nullable = Table( self.null_vfp_table.filename, default_data_types=dict( C=(Char, NoneType, NullType), L=(Logical, NoneType, NullType), D=(Date, NoneType, NullType), T=(DateTime, NoneType, NullType), M=(Char, NoneType, NullType), ), ) with nullable: nullable.append({'first': 'ethan', 'last':'doe'}) rec = nullable[-1] self.assertTrue(rec.first == 'ethan', "rec.first is %r" % (rec.first, )) self.assertTrue(rec.last == 'doe', "rec.last is %r" % (rec.last, )) self.assertTrue(rec.height is Null, "rec.height is %r" % (rec.height, )) self.assertTrue(rec.age is None, "rec.age is %r" % (rec.age, )) self.assertTrue(rec.life_story is Null, "rec.life_story is %r" % (rec.life_story, )) self.assertTrue(rec.plans is None, "rec.plans is %r" % (rec.plans, )) nullable.close() def test_flux_internal(self): "commit and rollback of flux record (implementation detail)" table = self.dbf_table table.open(mode=READ_WRITE) table.append(('dbf master', True, 77, Date(2012, 5, 20), "guru of some things dbf-y")) record = table[-1] old_data = dbf.scatter(record) record._start_flux() record.name = 'novice' record.paid = False record.qty = 69 record.orderdate = Date(2011, 1, 1) record.desc = 'master of all he surveys' try: self.assertEqual( dbf.scatter(record), dict( 
                    name=unicode('novice '),
                    paid=False,
                    qty=69,
                    orderdate=datetime.date(2011, 1, 1),
                    desc='master of all he surveys',
                    ))
        finally:
            record._rollback_flux()
        self.assertEqual(old_data, dbf.scatter(record))
        record._start_flux()
        record.name = 'novice'
        record.paid = False
        record.qty = 69
        record.orderdate = Date(2011, 1, 1)
        record._commit_flux()
        self.assertEqual(
                dbf.scatter(record),
                dict(
                    name=unicode('novice '),
                    paid=False,
                    qty=69,
                    orderdate=datetime.date(2011, 1, 1),
                    desc='guru of some things dbf-y',
                    ))
        self.assertNotEqual(old_data, dbf.scatter(record))

    def test_field_capitalization(self):
        "ensure mixed- and upper-case field names work"
        table = dbf.Table('mixed', 'NAME C(30); Age N(5,2)', on_disk=False)
        self.assertEqual(['NAME', 'AGE'], field_names(table))
        table.open(dbf.READ_WRITE)
        table.append({'Name': 'Ethan', 'AGE': 99})
        rec = table[0]
        self.assertEqual(rec.NaMe.strip(), 'Ethan')
        table.rename_field('NaMe', 'My_NAME')
        self.assertEqual(rec.My_NaMe.strip(), 'Ethan')
        self.assertEqual(['MY_NAME', 'AGE'], field_names(table))
        table.append({'MY_Name': 'Allen', 'AGE': 7})
        rec = table[1]
        self.assertEqual(rec.my_NaMe.strip(), 'Allen')


class TestDbfRecordTemplates(TestCase):
    "Testing records"

    def setUp(self):
        self.dbf_table = Table(
            os.path.join(tempdir, 'dbf_table'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M',
            dbf_type='db3',
            )
        self.vfp_table = Table(
            os.path.join(tempdir, 'vfp_table'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;' +
            ' weight F(18,3); age I; meeting T; misc G; photo P; price Y',
            dbf_type='vfp',
            )

    def tearDown(self):
        self.dbf_table.close()
        self.vfp_table.close()

    def test_dbf_storage(self):
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        record = table.create_template()
        record.name = 'Stoneleaf'
        record.paid = True
        record.qty = 1
        record.orderdate = Date.today()
        record.desc = 'some Python dude'
        table.append(record)

    def test_vfp_storage(self):
        table = self.vfp_table
        table.open(mode=READ_WRITE)
        record = table.create_template()
        record.name = 'Stoneleaf'
        record.paid = True
        record.qty = 1
        record.orderdate = Date.today()
        record.desc = 'some Python dude'
        record.mass = 251.9287
        record.weight = 971204.39
        record.age = 29
        record.meeting = DateTime.now()
        record.misc = MISC
        record.photo = PHOTO
        record.price = 19.99
        table.append(record)


class TestDbfFunctions(TestCase):

    def setUp(self):
        "create a dbf and vfp table"
        self.empty_dbf_table = Table(
            os.path.join(tempdir, 'emptytemptable'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M',
            dbf_type='db3',
            )
        self.dbf_table = table = Table(
            os.path.join(tempdir, 'temptable'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M',
            dbf_type='db3',
            )
        table.open(mode=READ_WRITE)
        namelist = self.dbf_namelist = []
        paidlist = self.dbf_paidlist = []
        qtylist = self.dbf_qtylist = []
        orderlist = self.dbf_orderlist = []
        desclist = self.dbf_desclist = []
        for i in range(len(floats)):
            name = '%-25s' % words[i]
            paid = len(words[i]) % 3 == 0
            qty = floats[i]
            orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) + 1, (numbers[i] % 27) + 1)
            desc = ' '.join(words[i:i+50])
            namelist.append(name)
            paidlist.append(paid)
            qtylist.append(qty)
            orderlist.append(orderdate)
            desclist.append(desc)
            table.append({'name': name, 'paid': paid, 'qty': qty, 'orderdate': orderdate, 'desc': desc})
        table.close()
        self.empty_vfp_table = Table(
            os.path.join(tempdir, 'emptytempvfp'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;'
            ' weight F(18,3); age I; meeting T; misc G; photo P; price Y;'
            ' dist B BINARY; atom I BINARY; wealth Y BINARY;',
            dbf_type='vfp',
            )
        self.odd_memo_vfp_table = Table(
            os.path.join(tempdir, 'emptytempvfp'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;'
            ' weight F(18,3); age I; meeting T; misc G; photo P; price Y;'
            ' dist B BINARY; atom I BINARY; wealth Y BINARY;',
            dbf_type='vfp',
            memo_size=48,
            )
        self.vfp_table = table = Table(
            os.path.join(tempdir, 'tempvfp'),
            'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;'
            ' weight F(18,3); age I; meeting T; misc G; photo P; price Y;'
            ' dist B BINARY; atom I BINARY; wealth Y BINARY;',
            dbf_type='vfp',
            )
        table.open(mode=READ_WRITE)
        namelist = self.vfp_namelist = []
        paidlist = self.vfp_paidlist = []
        qtylist = self.vfp_qtylist = []
        orderlist = self.vfp_orderlist = []
        desclist = self.vfp_desclist = []
        masslist = self.vfp_masslist = []
        weightlist = self.vfp_weightlist = []
        agelist = self.vfp_agelist = []
        meetlist = self.vfp_meetlist = []
        misclist = self.vfp_misclist = []
        photolist = self.vfp_photolist = []
        pricelist = self.vfp_pricelist = []
        for i in range(len(floats)):
            name = words[i]
            paid = len(words[i]) % 3 == 0
            qty = floats[i]
            price = Decimal(round(floats[i] * 2.182737, 4))
            orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) + 1, (numbers[i] % 27) + 1)
            desc = ' '.join(words[i:i+50])
            mass = floats[i] * floats[i] / 2.0
            weight = round(floats[i] * 3, 3)
            age = numbers[i]
            meeting = datetime.datetime(
                    (numbers[i] + 2000), (numbers[i] % 12) + 1, (numbers[i] % 28) + 1,
                    (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60)
            misc = ' '.join(words[i:i+50:3]).encode('latin1')
            photo = ' '.join(words[i:i+50:7]).encode('latin1')
            namelist.append('%-25s' % name)
            paidlist.append(paid)
            qtylist.append(qty)
            pricelist.append(price)
            orderlist.append(orderdate)
            desclist.append(desc)
            masslist.append(mass)
            weightlist.append(weight)
            agelist.append(age)
            meetlist.append(meeting)
            misclist.append(misc)
            photolist.append(photo)
            meeting = datetime.datetime(
                    (numbers[i] + 2000), (numbers[i] % 12) + 1, (numbers[i] % 28) + 1,
                    (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60)
            table.append({
                    'name': name, 'paid': paid, 'qty': qty, 'orderdate': orderdate, 'desc': desc,
                    'mass': mass, 'weight': weight, 'age': age, 'meeting': meeting, 'misc': misc,
                    'photo': photo, 'price': price, 'dist': mass, 'atom': age, 'wealth': price})
        table.close()

    def tearDown(self):
        self.dbf_table.close()
        self.vfp_table.close()

    def test_add_fields_to_dbf_table(self):
        "dbf table: adding and deleting fields"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        dbf._debug = True
        namelist = self.dbf_namelist
        paidlist = self.dbf_paidlist
        qtylist = self.dbf_qtylist
        orderlist = self.dbf_orderlist
        desclist = self.dbf_desclist
        table.delete_fields('name')
        table.close()
        table = Table(table.filename, dbf_type='db3')
        table.open(mode=READ_WRITE)
        for field in table.field_names:
            self.assertEqual(1, table.field_names.count(field))
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].paid, paidlist[i])
            self.assertEqual(record.paid, paidlist[i])
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(table[i].desc, desclist[i])
            self.assertEqual(record.desc, desclist[i])
            i += 1
        first, middle, last = table[0], table[len(table)//2], table[-1]
        table.delete_fields('paid, orderdate')
        for field in table.field_names:
            self.assertEqual(1, table.field_names.count(field))
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].desc, desclist[i])
            self.assertEqual(record.desc, desclist[i])
            i += 1
        self.assertEqual(i, len(table))
        self.assertTrue('paid' not in dbf.field_names(first))
        self.assertTrue('orderdate' not in dbf.field_names(middle))
        self.assertTrue('name' not in dbf.field_names(last))
        table.add_fields('name C(25); paid L; orderdate D')
        for field in table.field_names:
            self.assertEqual(1, table.field_names.count(field))
        self.assertEqual(i, len(table))
        i = 0
        for i, record in enumerate(table):
            self.assertEqual(record.name, ' ' * 25)
            self.assertEqual(record.paid, None)
            self.assertEqual(record.orderdate, None)
            self.assertEqual(record.desc, desclist[i])
            i += 1
        self.assertEqual(i, len(table))
        i = 0
        for record in table:
            data = dict()
            data['name'] = namelist[dbf.recno(record)]
            data['paid'] = paidlist[dbf.recno(record)]
            data['orderdate'] = orderlist[dbf.recno(record)]
            dbf.gather(record, data)
            i += 1
        self.assertEqual(i, len(table))
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].name, namelist[i])
            self.assertEqual(record.name, namelist[i])
            self.assertEqual(table[i].paid, paidlist[i])
            self.assertEqual(record.paid, paidlist[i])
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(table[i].desc, desclist[i])
            self.assertEqual(record.desc, desclist[i])
            i += 1
        table.close()

    def test_add_fields_to_vfp_table(self):
        "vfp table: adding and deleting fields"
        table = self.vfp_table
        table.open(mode=READ_WRITE)
        namelist = self.vfp_namelist
        paidlist = self.vfp_paidlist
        qtylist = self.vfp_qtylist
        orderlist = self.vfp_orderlist
        desclist = self.vfp_desclist
        masslist = self.vfp_masslist
        weightlist = self.vfp_weightlist
        agelist = self.vfp_agelist
        meetlist = self.vfp_meetlist
        misclist = self.vfp_misclist
        photolist = self.vfp_photolist
        pricelist = self.vfp_pricelist
        self.assertEqual(len(table), len(floats))
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].name, namelist[i])
            self.assertEqual(record.name, namelist[i])
            self.assertEqual(table[i].paid, paidlist[i])
            self.assertEqual(record.paid, paidlist[i])
            self.assertTrue(abs(table[i].qty - qtylist[i]) < .00001)
            self.assertTrue(abs(record.qty - qtylist[i]) < .00001)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(table[i].desc, desclist[i])
            self.assertEqual(record.desc, desclist[i])
            self.assertEqual(record.mass, masslist[i])
            self.assertEqual(table[i].mass, masslist[i])
            self.assertEqual(record.dist, masslist[i])
            self.assertEqual(table[i].dist, masslist[i])
            self.assertEqual(record.weight, weightlist[i])
            self.assertEqual(table[i].weight, weightlist[i])
            self.assertEqual(record.age, agelist[i])
            self.assertEqual(table[i].age, agelist[i])
            self.assertEqual(record.atom, agelist[i])
            self.assertEqual(table[i].atom, agelist[i])
            self.assertEqual(record.meeting, meetlist[i])
            self.assertEqual(table[i].meeting, meetlist[i])
            self.assertEqual(record.misc, misclist[i])
            self.assertEqual(table[i].misc, misclist[i])
            self.assertEqual(record.photo, photolist[i])
            self.assertEqual(table[i].photo, photolist[i])
            self.assertEqual(round(record.price, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].price, 4), round(pricelist[i], 4))
            # was assertTrue(x, y), which treats y as the failure message and never compares
            self.assertEqual(round(record.wealth, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].wealth, 4), round(pricelist[i], 4))
            i += 1
        table.delete_fields('desc')
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].name, namelist[i])
            self.assertEqual(record.name, namelist[i])
            self.assertEqual(table[i].paid, paidlist[i])
            self.assertEqual(record.paid, paidlist[i])
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(record.weight, weightlist[i])
            self.assertEqual(table[i].weight, weightlist[i])
            self.assertEqual(record.age, agelist[i])
            self.assertEqual(table[i].age, agelist[i])
            self.assertEqual(record.atom, agelist[i])
            self.assertEqual(table[i].atom, agelist[i])
            self.assertEqual(record.meeting, meetlist[i])
            self.assertEqual(table[i].meeting, meetlist[i])
            self.assertEqual(record.misc, misclist[i])
            self.assertEqual(table[i].misc, misclist[i])
            self.assertEqual(record.photo, photolist[i])
            self.assertEqual(table[i].photo, photolist[i])
            self.assertEqual(record.mass, masslist[i])
            self.assertEqual(table[i].mass, masslist[i])
            self.assertEqual(record.dist, masslist[i])
            self.assertEqual(table[i].dist, masslist[i])
            self.assertEqual(round(record.price, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].price, 4), round(pricelist[i], 4))
            # was assertTrue(x, y), which treats y as the failure message and never compares
            self.assertEqual(round(record.wealth, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].wealth, 4), round(pricelist[i], 4))
            i += 1
        table.delete_fields('paid, mass')
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].name, namelist[i])
            self.assertEqual(record.name, namelist[i])
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(record.weight, weightlist[i])
            self.assertEqual(table[i].weight, weightlist[i])
            self.assertEqual(record.age, agelist[i])
            self.assertEqual(table[i].age, agelist[i])
            self.assertEqual(record.atom, agelist[i])
            self.assertEqual(table[i].atom, agelist[i])
            self.assertEqual(record.meeting, meetlist[i])
            self.assertEqual(table[i].meeting, meetlist[i])
            self.assertEqual(record.misc, misclist[i])
            self.assertEqual(table[i].misc, misclist[i])
            self.assertEqual(record.photo, photolist[i])
            self.assertEqual(table[i].photo, photolist[i])
            self.assertEqual(record.dist, masslist[i])
            self.assertEqual(table[i].dist, masslist[i])
            self.assertEqual(round(record.price, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].price, 4), round(pricelist[i], 4))
            # was assertTrue(x, y), which treats y as the failure message and never compares
            self.assertEqual(round(record.wealth, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].wealth, 4), round(pricelist[i], 4))
            i += 1
        table.add_fields('desc M; paid L; mass B')
        i = 0
        for record in table:
            self.assertEqual(record.desc, unicode(''))
            self.assertEqual(record.paid is None, True)
            self.assertEqual(record.mass, 0.0)
            i += 1
        self.assertEqual(i, len(table))
        i = 0
        for record in Process(table):
            record.desc = desclist[dbf.recno(record)]
            record.paid = paidlist[dbf.recno(record)]
            record.mass = masslist[dbf.recno(record)]
            i += 1
        self.assertEqual(i, len(table))
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            self.assertEqual(table[i].name, namelist[i])
            self.assertEqual(record.name, namelist[i])
            self.assertEqual(table[i].paid, paidlist[i])
            self.assertEqual(record.paid, paidlist[i])
            self.assertEqual(abs(table[i].qty - qtylist[i]) < .00001, True)
            self.assertEqual(abs(record.qty - qtylist[i]) < .00001, True)
            self.assertEqual(table[i].orderdate, orderlist[i])
            self.assertEqual(record.orderdate, orderlist[i])
            self.assertEqual(table[i].desc, desclist[i])
            self.assertEqual(record.desc, desclist[i])
            self.assertEqual(record.mass, masslist[i])
            self.assertEqual(table[i].mass, masslist[i])
            self.assertEqual(record.dist, masslist[i])
            self.assertEqual(table[i].dist, masslist[i])
            self.assertEqual(record.weight, weightlist[i])
            self.assertEqual(table[i].weight, weightlist[i])
            self.assertEqual(record.age, agelist[i])
            self.assertEqual(table[i].age, agelist[i])
            self.assertEqual(record.atom, agelist[i])
            self.assertEqual(table[i].atom, agelist[i])
            self.assertEqual(record.meeting, meetlist[i])
            self.assertEqual(table[i].meeting, meetlist[i])
            self.assertEqual(record.misc, misclist[i])
            self.assertEqual(table[i].misc, misclist[i])
            self.assertEqual(record.photo, photolist[i])
            self.assertEqual(table[i].photo, photolist[i])
            self.assertEqual(round(record.price, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].price, 4), round(pricelist[i], 4))
            # was assertTrue(x, y), which treats y as the failure message and never compares
            self.assertEqual(round(record.wealth, 4), round(pricelist[i], 4))
            self.assertEqual(round(table[i].wealth, 4), round(pricelist[i], 4))
            i += 1
        table.close()

    def test_len_contains_iter(self):
        "basic function tests - len, contains & iterators"
        table = self.dbf_table.open()
        for field in table.field_names:
            self.assertEqual(1, table.field_names.count(field))
        length = sum([1 for rec in table])
        self.assertEqual(length, len(table))
        i = 0
        for record in table:
            self.assertEqual(record, table[i])
            self.assertTrue(record in table)
            self.assertTrue(tuple(record) in table)
            self.assertTrue(scatter(record) in table)
            self.assertTrue(create_template(record) in table)
            i += 1
        self.assertEqual(i, len(table))
        table.close()

    def test_undelete(self):
        "delete, undelete"
        table = Table(':memory:', 'name C(10)', dbf_type='db3', on_disk=False)
        table.open(mode=READ_WRITE)
        table.append()
        self.assertEqual(table.next_record, table[0])
        table = Table(':memory:', 'name C(10)', dbf_type='db3', on_disk=False)
        table.open(mode=READ_WRITE)
        table.append(multiple=10)
        self.assertEqual(table.next_record, table[0])
        table = self.dbf_table  # Table(os.path.join(tempdir, 'temptable'), dbf_type='db3')
        table.open(mode=READ_WRITE)
        total = len(table)
        table.bottom()
        self.assertEqual(dbf.recno(table.current_record), total)
        table.top()
        self.assertEqual(dbf.recno(table.current_record), -1)
        table.goto(27)
        self.assertEqual(dbf.recno(table.current_record), 27)
        table.goto(total-1)
        self.assertEqual(dbf.recno(table.current_record), total-1)
        table.goto(0)
        self.assertEqual(dbf.recno(table.current_record), 0)
        self.assertRaises(IndexError, table.goto, total)
        self.assertRaises(IndexError, table.goto, -len(table)-1)
        table.top()
        self.assertRaises(dbf.Bof, table.skip, -1)
        table.bottom()
        self.assertRaises(Eof, table.skip)
        for record in table:
            dbf.delete(record)
        active_records = table.create_index(active)
        active_records.top()
        self.assertRaises(Eof, active_records.skip)
        dbf._debug = True
        active_records.bottom()
        self.assertRaises(Bof, active_records.skip, -1)
        for record in table:
            dbf.undelete(record)
        # delete every third record
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            if i % 3 == 0:
                dbf.delete(record)
            i += 1
        i = 0
        # and verify
        for record in table:
            self.assertEqual(dbf.is_deleted(record), i % 3 == 0)
            self.assertEqual(dbf.is_deleted(table[i]), i % 3 == 0)
            i += 1
        # check that deletes were saved to disk..
        table.close()
        table = Table(os.path.join(tempdir, 'temptable'), dbf_type='db3')
        table.open(mode=READ_WRITE)
        active_records = table.create_index(active)
        i = 0
        for record in table:
            self.assertEqual(dbf.is_deleted(record), i % 3 == 0)
            self.assertEqual(dbf.is_deleted(table[i]), i % 3 == 0)
            i += 1
        # verify record numbers
        i = 0
        for record in table:
            self.assertEqual(dbf.recno(record), i)
            i += 1
        # verify that deleted records are skipped
        i = 0
        for record in active_records:
            self.assertNotEqual(dbf.recno(record) % 3, 0)
        active_records.goto(1)
        active_records.skip()
        self.assertEqual(dbf.recno(active_records.current_record), 4)
        active_records.skip(-1)
        self.assertEqual(dbf.recno(active_records.current_record), 2)
        # verify that deleted records are skipped in slices
        list_of_records = active_records[3:6]
        self.assertEqual(len(list_of_records), 3)
        self.assertEqual(dbf.recno(list_of_records[0]), 5)
        self.assertEqual(dbf.recno(list_of_records[1]), 7)
        self.assertEqual(dbf.recno(list_of_records[2]), 8)
        # verify behavior when all records are deleted
        for record in table:
            dbf.delete(record)
        active_records.bottom()
        self.assertRaises(Eof, active_records.skip)
        self.assertEqual(active_records.eof, True)
        active_records.top()
        self.assertRaises(Bof, active_records.skip, -1)
        self.assertEqual(active_records.bof, True)
        # verify deleted records are seen with active record index
        deleted_records = table.create_index(inactive)
        i = 0
        for record in deleted_records:
            self.assertEqual(dbf.recno(record), i)
            i += 1
        # verify undelete using table[index]
        for record in table:
            dbf.delete(record)
            self.assertTrue(dbf.is_deleted(record))
        for i, record in enumerate(table):
            dbf.undelete(table[i])
            self.assertEqual(dbf.is_deleted(record), False)
            self.assertEqual(dbf.is_deleted(table[i]), False)
            self.assertFalse(record in deleted_records)
        # verify all records have been undeleted (recalled)
        self.assertEqual(len(active_records), len(table))
        self.assertEqual(len(deleted_records), 0)
        table.close()

    def test_finding_ordering_searching(self):
        "finding, ordering, searching"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        # find (brute force)
        unordered = []
        for record in table:
            unordered.append(record.name)
        for word in unordered:                                  # returns records
            # records = table.query("select * where name == %r" % word)
            # self.assertEqual(len(records), unordered.count(word))
            records = [rec for rec in table if rec.name == word]
            self.assertEqual(len(records), unordered.count(word))
        # ordering by one field
        ordered = unordered[:]
        ordered.sort()
        name_index = table.create_index(lambda rec: rec.name)
        self.assertEqual(list(name_index[::-1]), list(reversed(name_index)))
        i = 0
        for record in name_index:
            self.assertEqual(record.name, ordered[i])
            i += 1
        # search (BINARY)
        for word in unordered:
            records = name_index.search(match=word)
            self.assertEqual(len(records), unordered.count(word),
                    "num records: %d\nnum words: %d\nfailure with %r" % (len(records), unordered.count(word), word))
            records = table.query("select * where name == %r" % word)
            self.assertEqual(len(records), unordered.count(word))
            records = dbf.pqlc(table, "select * where name == %r" % word)
            self.assertEqual(len(records), unordered.count(word))
        # ordering by two fields
        ordered = unordered[:]
        ordered.sort()
        nd_index = table.create_index(lambda rec: (rec.name, rec.desc))
        self.assertEqual(list(nd_index[::-1]), list(reversed(nd_index)))
        i = 0
        for record in nd_index:
            self.assertEqual(record.name, ordered[i])
            i += 1
        # search (BINARY)
        for word in unordered:
            records = nd_index.search(match=(word, ), partial=True)
            ucount = sum([1 for wrd in unordered if wrd.startswith(word)])
            self.assertEqual(len(records), ucount)
        # partial search
        rec = nd_index[7]
        self.assertTrue(nd_index.search((rec.name, rec.desc[:4]), partial=True))
        for record in table[::2]:
            dbf.write(record, qty=-record.qty)
        unordered = []
        for record in table:
            unordered.append(record.qty)
        ordered = unordered[:]
        ordered.sort()
        qty_index = table.create_index(lambda rec: rec.qty)
        self.assertEqual(list(qty_index[::-1]), list(reversed(qty_index)))
        i = 0
        for record in qty_index:
            self.assertEqual(record.qty, ordered[i])
            i += 1
        for number in unordered:
            records = qty_index.search(match=(number, ))
            self.assertEqual(len(records), unordered.count(number))
        table.close()

    def test_scatter_gather_new(self):
        "scattering and gathering fields, and new()"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        table2 = table.new(os.path.join(tempdir, 'temptable2'))
        table2.open(mode=READ_WRITE)
        for record in table:
            table2.append()
            newrecord = table2[-1]
            testdict = dbf.scatter(record)
            for key in field_names(testdict):
                self.assertEqual(testdict[key], record[key])
            dbf.gather(newrecord, dbf.scatter(record))
            for field in dbf.field_names(record):
                self.assertEqual(newrecord[field], record[field])
        table2.close()
        table2 = None
        table2 = Table(os.path.join(tempdir, 'temptable2'), dbf_type='db3')
        table2.open(mode=READ_WRITE)
        for i in range(len(table)):
            temp1 = dbf.scatter(table[i])
            temp2 = dbf.scatter(table2[i])
            for key in field_names(temp1):
                self.assertEqual(temp1[key], temp2[key])
            for key in field_names(temp2):
                self.assertEqual(temp1[key], temp2[key])
        table2.close()
        table3 = table.new(':memory:', on_disk=False)
        table3.open(mode=READ_WRITE)
        for record in table:
            table3.append(record)
        table4 = self.vfp_table
        table4.open(mode=READ_WRITE)
        table5 = table4.new(':memory:', on_disk=False)
        table5.open(mode=READ_WRITE)
        for record in table4:
            table5.append(record)
        table.close()
        table3.close()
        table4.close()
        table5.close()

    def test_rename_contains_has_key(self):
        "renaming fields, __contains__, has_key"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        for field in table.field_names:
            oldfield = field
            table.rename_field(oldfield, 'newfield')
            self.assertEqual(oldfield in table.field_names, False)
            self.assertEqual('newfield' in table.field_names, True)
            table.close()
            table = Table(os.path.join(tempdir, 'temptable'), dbf_type='db3')
            table.open(mode=READ_WRITE)
            self.assertEqual(oldfield in table.field_names, False)
            self.assertEqual('newfield' in table.field_names, True)
            table.rename_field('newfield', oldfield)
            self.assertEqual(oldfield in table.field_names, True)
            self.assertEqual('newfield' in table.field_names, False)
        table.close()

    def test_dbf_record_kamikaze(self):
        "kamikaze"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        table2 = table.new(os.path.join(tempdir, 'temptable2'))
        table2.open(mode=READ_WRITE)
        for record in table:
            table2.append(record)
            newrecord = table2[-1]
            for key in table.field_names:
                if key not in table.memo_types:
                    self.assertEqual(newrecord[key], record[key])
            for field in dbf.field_names(newrecord):
                if field not in table2.memo_types:      # was `key`, leaking the previous loop variable
                    self.assertEqual(newrecord[field], record[field])
        table2.close()
        table2 = Table(os.path.join(tempdir, 'temptable2'), dbf_type='db3')
        table2.open(mode=READ_WRITE)
        for i in range(len(table)):
            dict1 = dbf.scatter(table[i], as_type=dict)
            dict2 = dbf.scatter(table2[i], as_type=dict)
            for key in dict1.keys():
                if key not in table.memo_types:
                    self.assertEqual(dict1[key], dict2[key])
            for key in dict2.keys():
                if key not in table2.memo_types:
                    self.assertEqual(dict1[key], dict2[key])
        for i in range(len(table)):
            template1 = dbf.scatter(table[i])
            template2 = dbf.scatter(table2[i])
            for key in dbf.field_names(template1):
                if key not in table.memo_types:
                    self.assertEqual(template1[key], template2[key])
            for key in dbf.field_names(template2):
                if key not in table2.memo_types:
                    self.assertEqual(template1[key], template2[key])
        table.close()
        table2.close()

    def test_multiple_append(self):
        "multiple append"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        table2 = table.new(os.path.join(tempdir, 'temptable2'))
        table2.open(mode=READ_WRITE)
        record = table.next_record
        table2.append(dbf.scatter(record), multiple=100)
        for samerecord in table2:
            for field in dbf.field_names(record):
                self.assertEqual(record[field], samerecord[field])
        table2.close()
        table2 = Table(os.path.join(tempdir, 'temptable2'), dbf_type='db3')
        table2.open(mode=READ_WRITE)
        for samerecord in table2:
            for field in dbf.field_names(record):
                self.assertEqual(record[field], samerecord[field])
        table2.close()
        table3 = table.new(os.path.join(tempdir, 'temptable3'))
        table3.open(mode=READ_WRITE)
        record = table.next_record
        table3.append(record, multiple=100)
        for samerecord in table3:
            for field in dbf.field_names(record):
                self.assertEqual(record[field], samerecord[field])
        table3.close()
        table3 = Table(os.path.join(tempdir, 'temptable3'), dbf_type='db3')
        table3.open(mode=READ_WRITE)
        for samerecord in table3:
            for field in dbf.field_names(record):
                self.assertEqual(record[field], samerecord[field])
        table3.close()
        table.close()

    def test_slices(self):
        "slices"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        slice1 = [table[0], table[1], table[2]]
        self.assertEqual(slice1, list(table[:3]))
        slice2 = [table[-3], table[-2], table[-1]]
        self.assertEqual(slice2, list(table[-3:]))
        slice3 = [record for record in table]
        self.assertEqual(slice3, list(table[:]))
        slice4 = [table[9]]
        self.assertEqual(slice4, list(table[9:10]))
        slice5 = [table[15], table[16], table[17], table[18]]
        self.assertEqual(slice5, list(table[15:19]))
        slice6 = [table[0], table[2], table[4], table[6], table[8]]
        self.assertEqual(slice6, list(table[:9:2]))
        slice7 = [table[-1], table[-2], table[-3]]
        self.assertEqual(slice7, list(table[-1:-4:-1]))
        table.close()

    def test_record_reset(self):
        "reset record"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        for record in table:
            with record:
                self.assertTrue(record.qty)
                dbf.reset(record, keep_fields=['name'])
                self.assertFalse(record.qty)
                self.assertTrue(record.name)
        for record in table:
            dbf.reset(record)
        self.assertEqual(table[0].name, table[1].name)
        dbf.write(table[0], name='Python rocks!')
        self.assertNotEqual(table[0].name, table[1].name)
        table.close()

    def test_adding_memos(self):
        "adding memos to existing records"
        table = Table(':memory:', 'name C(50); age N(3,0)', dbf_type='db3', on_disk=False)
        table.open(mode=READ_WRITE)
        table.append(('user', 0))
        table.add_fields('motto M')
        dbf.write(table[0], motto='Are we there yet??')
        self.assertEqual(table[0].motto, 'Are we there yet??')
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), 'name C(50); age N(3,0)', dbf_type='db3')
        table.open(mode=READ_WRITE)
        table.append(('user', 0))
        table.close()
        table.open(mode=READ_WRITE)
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), dbf_type='db3')
        table.open(mode=READ_WRITE)
        table.add_fields('motto M')
        dbf.write(table[0], motto='Are we there yet??')
        self.assertEqual(table[0].motto, 'Are we there yet??')
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), dbf_type='db3')
        table.open(mode=READ_WRITE)
        self.assertEqual(table[0].motto, 'Are we there yet??')
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), 'name C(50); age N(3,0)', dbf_type='vfp')
        table.open(mode=READ_WRITE)
        table.append(('user', 0))
        table.close()
        table.open(mode=READ_WRITE)
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), dbf_type='vfp')
        table.open(mode=READ_WRITE)
        table.add_fields('motto M')
        dbf.write(table[0], motto='Are we there yet??')
        self.assertEqual(table[0].motto, 'Are we there yet??')
        table.close()
        table = Table(os.path.join(tempdir, 'temptable4'), dbf_type='vfp')
        table.open(mode=READ_WRITE)
        self.assertEqual(table[0].motto, 'Are we there yet??')
        table.close()

    def test_from_csv(self):
        "from_csv"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        dbf.export(table, table.filename, header=False)
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]), csvtable[i][j])
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), to_disk=True,
                filename=os.path.join(tempdir, 'temptable5'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]).strip(), csvtable[i][j].strip())
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), field_names=['field1', 'field2'])
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]), csvtable[i][j])
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), field_names=['field1', 'field2'],
                to_disk=True, filename=os.path.join(tempdir, 'temptable5'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]).strip(), csvtable[i][j].strip())
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), extra_fields=['count N(5,0)', 'id C(10)'])
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]), csvtable[i][j])
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), extra_fields=['count N(5,0)', 'id C(10)'],
                to_disk=True, filename=os.path.join(tempdir, 'temptable5'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]).strip(), csvtable[i][j].strip())
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), field_names=['name', 'qty', 'paid', 'desc'],
                extra_fields='test1 C(15);test2 L'.split(';'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]), csvtable[i][j])
        csvtable.close()
        csvtable = dbf.from_csv(os.path.join(tempdir, 'temptable.csv'), field_names=['name', 'qty', 'paid', 'desc'],
                extra_fields='test1 C(15);test2 L'.split(';'), to_disk=True,
                filename=os.path.join(tempdir, 'temptable5'))
        csvtable.open(mode=READ_WRITE)
        for i in index(table):
            for j in index(table.field_names):
                self.assertEqual(str(table[i][j]).strip(), csvtable[i][j].strip())
        csvtable.close()

    def test_resize_empty(self):
        "resize"
        table = self.empty_dbf_table
        table.open(mode=READ_WRITE)
        table.resize_field('name', 40)
        table.close()

    def test_resize(self):
        "resize"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        test_record = dbf.scatter(table[5])
        table.resize_field('name', 40)
        new_record = dbf.scatter(table[5])
        self.assertEqual(test_record['orderdate'], new_record['orderdate'])
        table.close()

    def test_memos_after_close(self):
        "memos available after close/open"
        table = dbf.Table('tempy', 'name C(20); desc M', dbf_type='db3', default_data_types=dict(C=Char))
        table.open(mode=READ_WRITE)
        table.append(('Author', 'dashing, debonair, delightful'))
        table.close()
        table.open(mode=READ_WRITE)
        self.assertEqual(tuple(table[0]), ('Author', 'dashing, debonair, delightful'))
        table.close()
        table2 = dbf.Table('tempy', 'name C(20); desc M', dbf_type='db3')
        table2.open(mode=READ_WRITE)
        table2.append(('Benedict', 'brilliant, bombastic, bothered'))
        table2.close()
        table.open(mode=READ_WRITE)
        self.assertEqual(table[0].name, 'Benedict')
        self.assertEqual(table[0].desc, 'brilliant, bombastic, bothered')
        table.close()

    def test_field_type(self):
        "table.type(field) == ('C', Char)"
        table = dbf.Table('tempy', 'name C(20); desc M', dbf_type='db3', default_data_types=dict(C=Char))
        table.open(mode=READ_WRITE)
        field_info = table.field_info('name')
        self.assertEqual(field_info, (FieldType.CHAR, 20, 0, Char))
        self.assertEqual(field_info.field_type, FieldType.CHAR)
        self.assertEqual(field_info.length, 20)
        self.assertEqual(field_info.decimal, 0)
        self.assertEqual(field_info.py_type, Char)
        table.close()

    def test_memo_after_backup(self):
        "memo fields accessible after .backup()"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        table.create_backup()
        backup = dbf.Table(table.backup)
        backup.open(mode=READ_WRITE)
        desclist = self.dbf_desclist
        for i in range(len(desclist)):
            self.assertEqual(desclist[i], backup[i].desc)
        backup.close()
        table.close()

    def test_memo_file_size_before_backup(self):
        table = self.odd_memo_vfp_table
        self.assertEqual(48, table._meta.memo_size)

    def test_memo_file_size_after_backup(self):
        table = self.odd_memo_vfp_table
        table.open(mode=READ_ONLY)
        table.create_backup()
        table.close()
        backup = dbf.Table(table.backup)
        self.assertEqual(backup._meta.memo_size, table._meta.memo_size)

    def test_write_loop(self):
        "Process loop commits changes"
        table = self.dbf_table
        table.open(mode=READ_WRITE)
        for record in Process(table):
            record.name = '!BRAND NEW NAME!'
        for record in table:
            self.assertEqual(record.name, '!BRAND NEW NAME! ')
        table.close()

    def test_export_headers(self):
        for table in self.dbf_table, self.vfp_table:
            table.open(mode=READ_WRITE)
            dest = os.path.join(tempdir, 'test_export.csv')
            dbf.export(table, filename=dest)
            with open(dest) as fh:
                headers = fh.readline()
            self.assertEqual(headers.strip(), ','.join(table.field_names))

    def test_index_search(self):
        table = Table("unordered", "icao C(20)", default_data_types=dict(C=Char), on_disk=False).open(mode=READ_WRITE)
        icao = ("kilo charlie echo golf papa hotel delta tango india sierra juliet lima zulu mike "
                "bravo november alpha oscar quebec romeo uniform victor whiskey x-ray yankee foxtrot".split())
        for alpha in icao:
            table.append((alpha,))
        sorted = table.create_index(lambda rec: rec.icao)
        self.assertTrue(sorted.index_search('alpha'))
        self.assertTrue(sorted.index_search('bravo'))
        self.assertTrue(sorted.index_search('charlie'))
        self.assertTrue(sorted.index_search('delta'))
        self.assertTrue(sorted.index_search('echo'))
        self.assertTrue(sorted.index_search('foxtrot'))
        self.assertTrue(sorted.index_search('golf'))
        self.assertTrue(sorted.index_search('hotel'))
        self.assertTrue(sorted.index_search('india'))
        self.assertTrue(sorted.index_search('juliet'))
        self.assertTrue(sorted.index_search('kilo'))
        self.assertTrue(sorted.index_search('lima'))
        self.assertTrue(sorted.index_search('mike'))
        self.assertTrue(sorted.index_search('november'))
        self.assertTrue(sorted.index_search('oscar'))
        self.assertTrue(sorted.index_search('papa'))
self.assertTrue(sorted.index_search('quebec')) self.assertTrue(sorted.index_search('romeo')) self.assertTrue(sorted.index_search('sierra')) self.assertTrue(sorted.index_search('tango')) self.assertTrue(sorted.index_search('uniform')) self.assertTrue(sorted.index_search('victor')) self.assertTrue(sorted.index_search('whiskey')) self.assertTrue(sorted.index_search('x-ray')) self.assertTrue(sorted.index_search('yankee')) self.assertTrue(sorted.index_search('zulu')) self.assertEqual(sorted.index_search('alpha'), 0) self.assertEqual(sorted.index_search('bravo'), 1) self.assertEqual(sorted.index_search('charlie'), 2) self.assertEqual(sorted.index_search('delta'), 3) self.assertEqual(sorted.index_search('echo'), 4) self.assertEqual(sorted.index_search('foxtrot'), 5) self.assertEqual(sorted.index_search('golf'), 6) self.assertEqual(sorted.index_search('hotel'), 7) self.assertEqual(sorted.index_search('india'), 8) self.assertEqual(sorted.index_search('juliet'), 9) self.assertEqual(sorted.index_search('kilo'), 10) self.assertEqual(sorted.index_search('lima'), 11) self.assertEqual(sorted.index_search('mike'), 12) self.assertEqual(sorted.index_search('november'), 13) self.assertEqual(sorted.index_search('oscar'), 14) self.assertEqual(sorted.index_search('papa'), 15) self.assertEqual(sorted.index_search('quebec'), 16) self.assertEqual(sorted.index_search('romeo'), 17) self.assertEqual(sorted.index_search('sierra'), 18) self.assertEqual(sorted.index_search('tango'), 19) self.assertEqual(sorted.index_search('uniform'), 20) self.assertEqual(sorted.index_search('victor'), 21) self.assertEqual(sorted.index_search('whiskey'), 22) self.assertEqual(sorted.index_search('x-ray'), 23) self.assertEqual(sorted.index_search('yankee'), 24) self.assertEqual(sorted.index_search('zulu'), 25) self.assertRaises(NotFoundError, sorted.index_search, 'john') self.assertRaises(NotFoundError, sorted.index_search, 'john', partial=True) self.assertEqual(sorted.index_search('able', nearest=True), 0) 
self.assertFalse(sorted.index_search('able', nearest=True)) self.assertEqual(sorted.index_search('alp', partial=True), 0) self.assertTrue(sorted.index_search('alp', partial=True)) self.assertEqual(sorted.index_search('john', nearest=True), 9) self.assertFalse(sorted.index_search('john', nearest=True)) self.assertEqual(sorted.index_search('jul', partial=True), 9) self.assertTrue(sorted.index_search('jul', partial=True)) def test_mismatched_extensions(self): old_memo_name = self.dbf_table._meta.memoname new_memo_name = old_memo_name[:-3] + 'Dbt' os.rename(old_memo_name, new_memo_name) table = Table(self.dbf_table._meta.filename) self.assertEqual(table._meta.memoname, new_memo_name) with table: for rec, desc in zip(table, self.dbf_desclist): self.assertEqual(rec.desc, desc) # old_memo_name = self.vfp_table._meta.memoname new_memo_name = old_memo_name[:-3] + 'fPt' os.rename(old_memo_name, new_memo_name) table = Table(self.vfp_table._meta.filename) self.assertEqual(table._meta.memoname, new_memo_name) with table: for rec, desc in zip(table, self.vfp_desclist): self.assertEqual(rec.desc, desc) class TestDbfNavigation(TestCase): def setUp(self): "create a dbf and vfp table" self.dbf_table = table = Table( os.path.join(tempdir, 'temptable'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3' ) table.open(mode=READ_WRITE) namelist = self.dbf_namelist = [] paidlist = self.dbf_paidlist = [] qtylist = self.dbf_qtylist = [] orderlist = self.dbf_orderlist = [] desclist = self.dbf_desclist = [] for i in range(len(floats)): name = '%-25s' % words[i] paid = len(words[i]) % 3 == 0 qty = floats[i] orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) namelist.append(name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) table.append({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc}) table.close() self.vfp_table = table = Table( 
os.path.join(tempdir, 'tempvfp'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;' ' weight F(18,3); age I; meeting T; misc G; photo P', dbf_type='vfp', ) table.open(mode=READ_WRITE) namelist = self.vfp_namelist = [] paidlist = self.vfp_paidlist = [] qtylist = self.vfp_qtylist = [] orderlist = self.vfp_orderlist = [] desclist = self.vfp_desclist = [] masslist = self.vfp_masslist = [] weightlist = self.vfp_weightlist = [] agelist = self.vfp_agelist = [] meetlist = self.vfp_meetlist = [] misclist = self.vfp_misclist = [] photolist = self.vfp_photolist = [] for i in range(len(floats)): name = words[i] paid = len(words[i]) % 3 == 0 qty = floats[i] orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) mass = floats[i] * floats[i] / 2.0 weight = floats[i] * 3 age = numbers[i] meeting = datetime.datetime((numbers[i] + 2000), (numbers[i] % 12)+1, (numbers[i] % 28)+1, \ (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60) misc = ' '.join(words[i:i+50:3]).encode('ascii') photo = ' '.join(words[i:i+50:7]).encode('ascii') namelist.append('%-25s' % name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) masslist.append(mass) weightlist.append(weight) agelist.append(age) meetlist.append(meeting) misclist.append(misc) photolist.append(photo) meeting = datetime.datetime((numbers[i] + 2000), (numbers[i] % 12)+1, (numbers[i] % 28)+1, (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60) table.append({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc, 'mass':mass, 'weight':weight, 'age':age, 'meeting':meeting, 'misc':misc, 'photo':photo}) table.close() def tearDown(self): self.dbf_table.close() self.vfp_table.close() def test_top(self): "top, current in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) mid = 
total // 2 table.goto(mid) list.goto(mid) index.goto(mid) self.assertTrue(table.current != -1) self.assertTrue(list.current != -1) self.assertTrue(index.current != -1) table.top() list.top() index.top() self.assertEqual(table.current, -1) self.assertEqual(list.current, -1) self.assertEqual(index.current, -1) def test_bottom(self): "bottom, current in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) mid = total // 2 table.goto(mid) list.goto(mid) index.goto(mid) self.assertTrue(table.current != -1) self.assertTrue(list.current != -1) self.assertTrue(index.current != -1) table.bottom() list.bottom() index.bottom() self.assertEqual(table.current, total) self.assertEqual(list.current, total) self.assertEqual(index.current, total) def test_goto(self): "goto, current in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) mid = total // 2 table.goto(mid) list.goto(mid) index.goto(mid) self.assertEqual(table.current, mid) self.assertEqual(list.current, mid) self.assertEqual(index.current, mid) table.goto('top') list.goto('top') index.goto('top') self.assertEqual(table.current, -1) self.assertEqual(list.current, -1) self.assertEqual(index.current, -1) table.goto('bottom') list.goto('bottom') index.goto('bottom') self.assertEqual(table.current, total) self.assertEqual(list.current, total) self.assertEqual(index.current, total) dbf.delete(table[10]) self.assertTrue(dbf.is_deleted(list[10])) self.assertTrue(dbf.is_deleted(index[10])) table.goto(10) list.goto(10) index.goto(10) self.assertEqual(table.current, 10) self.assertEqual(list.current, 10) self.assertEqual(index.current, 10) table.close() def test_skip(self): "skip, current in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = 
Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) self.assertEqual(table.current, -1) self.assertEqual(list.current, -1) self.assertEqual(index.current, -1) table.skip(1) list.skip(1) index.skip(1) self.assertEqual(table.current, 0) self.assertEqual(list.current, 0) self.assertEqual(index.current, 0) table.skip(10) list.skip(10) index.skip(10) self.assertEqual(table.current, 10) self.assertEqual(list.current, 10) self.assertEqual(index.current, 10) table.close() def test_first_record(self): "first_record in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) self.assertTrue(table[0] is list[0]) self.assertTrue(table[0] is index[0]) self.assertTrue(table.first_record is table[0]) self.assertTrue(list.first_record is table[0]) self.assertTrue(index.first_record is table[0]) table.close() def test_prev_record(self): "prev_record in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) self.assertTrue(table[0] is list[0]) self.assertTrue(table[0] is index[0]) table.top() list.top() index.top() self.assertTrue(isinstance(table.prev_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.prev_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.prev_record, dbf.RecordVaporWare)) table.skip() list.skip() index.skip() self.assertTrue(isinstance(table.prev_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.prev_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.prev_record, dbf.RecordVaporWare)) table.skip() list.skip() index.skip() self.assertTrue(table.prev_record is table[0]) self.assertTrue(list.prev_record is table[0]) self.assertTrue(index.prev_record is table[0]) table.close() def test_current_record(self): "current_record in Tables, Lists, and Indexes" table = self.dbf_table 
table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) mid = total // 2 table.top() list.top() index.top() self.assertTrue(isinstance(table.current_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.current_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.current_record, dbf.RecordVaporWare)) table.bottom() list.bottom() index.bottom() self.assertTrue(isinstance(table.current_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.current_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.current_record, dbf.RecordVaporWare)) table.goto(mid) list.goto(mid) index.goto(mid) self.assertTrue(table.current_record is table[mid]) self.assertTrue(list.current_record is table[mid]) self.assertTrue(index.current_record is table[mid]) table.close() def test_next_record(self): "next_record in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) self.assertTrue(table[0] is list[0]) self.assertTrue(table[0] is index[0]) table.bottom() list.bottom() index.bottom() self.assertTrue(isinstance(table.next_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.next_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.next_record, dbf.RecordVaporWare)) table.skip(-1) list.skip(-1) index.skip(-1) self.assertTrue(isinstance(table.next_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(list.next_record, dbf.RecordVaporWare)) self.assertTrue(isinstance(index.next_record, dbf.RecordVaporWare)) table.skip(-1) list.skip(-1) index.skip(-1) self.assertTrue(table.next_record is table[-1]) self.assertTrue(list.next_record is table[-1]) self.assertTrue(index.next_record is table[-1]) table.close() def test_last_record(self): "last_record in Tables, Lists, and Indexes" table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) index = 
Index(table, key=lambda rec: dbf.recno(rec)) total = len(table) self.assertTrue(table[-1] is list[-1]) self.assertTrue(table[-1] is index[-1]) self.assertTrue(table.last_record is table[-1]) self.assertTrue(list.last_record is table[-1]) self.assertTrue(index.last_record is table[-1]) table.close() class TestDbfLists(TestCase): "DbfList tests" def setUp(self): "create a dbf table" self.dbf_table = table = Table( os.path.join(tempdir, 'temptable'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3' ) table.open(mode=READ_WRITE) records = [] for i in range(len(floats)): name = words[i] paid = len(words[i]) % 3 == 0 qty = round(floats[i], 5) orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) data = {'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc} table.append(data) records.append(data) table.close() table.open(mode=READ_WRITE) for trec, drec in zip(table, records): self.assertEqual(trec.name.strip(), drec['name']) self.assertEqual(trec.paid, drec['paid']) self.assertEqual(trec.qty, drec['qty']) self.assertEqual(trec.orderdate, drec['orderdate']) self.assertEqual(trec.desc, drec['desc']) table.close() def tearDown(self): self.dbf_table.close() def test_exceptions(self): table = self.dbf_table table.open(mode=READ_WRITE) list = table[::5] record = table[5] dbf.delete(record) self.assertTrue(list[0] is table[0]) self.assertTrue(record in list) self.assertRaises(TypeError, list.__contains__, 'some string') self.assertRaises(TypeError, list.__getitem__, 'some string') self.assertRaises(TypeError, list.__delitem__, 'some string') self.assertRaises(TypeError, list.remove, 'some string') self.assertRaises(TypeError, list.index, 'some string') self.assertRaises(IndexError, list.__getitem__, 100) self.assertRaises(IndexError, list.pop, 1000) self.assertRaises(IndexError, list.goto, 1000) list.top() self.assertRaises(Bof, list.skip, -1) list.bottom() 
self.assertRaises(Eof, list.skip) table.pack() self.assertRaises(DbfError, list.__contains__, record) list = List() self.assertRaises(IndexError, list.goto, 0) self.assertRaises(Bof, list.skip, -1) self.assertRaises(Eof, list.skip) self.assertRaises(ValueError, list.remove, table[0]) self.assertRaises(ValueError, list.index, table[1]) def test_add_subtract(self): "addition and subtraction" table1 = self.dbf_table table1.open(mode=READ_WRITE) list1 = table1[::2] list2 = table1[::3] list3 = table1[:] - list1 - list2 self.assertEqual(100, len(table1)) self.assertEqual(list1[0], list2[0]) self.assertEqual(list1[3], list2[2]) self.assertEqual(50, len(list1)) self.assertEqual(34, len(list2)) self.assertEqual(33, len(list3)) self.assertEqual(117, len(list1) + len(list2) + len(list3)) self.assertEqual(len(table1), len(list1 + list2 + list3)) self.assertEqual(67, len(list1 + list2)) self.assertEqual(33, len(list1 - list2)) self.assertEqual(17, len(list2 - list1)) table1.close() def test_append_extend(self): "appending and extending" table1 = self.dbf_table table1.open(mode=READ_WRITE) list1 = table1[::2] list2 = table1[::3] list3 = table1[:] - list1 - list2 list1.extend(list2) list2.append(table1[1]) self.assertEqual(67, len(list1)) self.assertEqual(35, len(list2)) list1.append(table1[1]) list2.extend(list3) self.assertEqual(68, len(list1)) self.assertEqual(67, len(list2)) table1.close() def test_index(self): "indexing" table1 = self.dbf_table table1.open(mode=READ_WRITE) list1 = table1[::2] list2 = table1[::3] list3 = table1[:] - list1 - list2 for i, rec in enumerate(list1): self.assertEqual(i, list1.index(rec)) for rec in list3: self.assertRaises(ValueError, list1.index, rec ) table1.close() def test_sort(self): "sorting" table1 = self.dbf_table table1.open(mode=READ_WRITE) list1 = table1[::2] list2 = table1[::3] table1[:] - list1 - list2 list4 = table1[:] index = table1.create_index(key = lambda rec: rec.name ) list4.sort(key=lambda rec: rec.name) for trec, lrec in 
zip(index, list4): self.assertEqual(dbf.recno(trec), dbf.recno(lrec)) table1.close() def test_keys(self): "keys" table1 = self.dbf_table table1.open(mode=READ_WRITE) field = table1.field_names[0] list1 = List(table1, key=lambda rec: rec[field]) unique = set() for rec in table1: if rec[field] not in unique: unique.add(rec[field]) else: self.assertRaises(NotFoundError, list1.index, rec) self.assertFalse(rec in list1) self.assertTrue(rec[field] in unique) self.assertEqual(len(unique), len(list1)) table1.close() def test_contains(self): table = self.dbf_table table.open(mode=READ_WRITE) list = List(table) i = 0 for record in list: self.assertEqual(record, list[i]) self.assertTrue(record in list) self.assertTrue(tuple(record) in list) self.assertTrue(scatter(record) in list) self.assertTrue(create_template(record) in list) i += 1 self.assertEqual(i, len(list)) table.close() class TestFieldnameLists(TestCase): "FieldnameList tests" def test_exceptions(self): self.assertRaises(TypeError, FieldnameList, [1]) self.assertRaises(TypeError, FieldnameList, ([u'1toy', int])) list1 = FieldnameList(unicodify(['lower', 'UPPER', 'MiXeD'])) self.assertRaises(TypeError, list1.__add__, [7]) self.assertRaises(TypeError, list1.__contains__, 7) self.assertRaises(TypeError, list1.__iadd__, [7]) self.assertRaises(TypeError, list1.__radd__, [7]) self.assertRaises(TypeError, list1.__setitem__, 0, 7) self.assertRaises(TypeError, list1.append, 7) self.assertRaises(TypeError, list1.count, 7) self.assertRaises(TypeError, list1.index, 7) self.assertRaises(TypeError, list1.insert, 7) self.assertRaises(TypeError, list1.remove, 7) def test_create(self): list1 = FieldnameList(['_this', 'that', 'somemore8']) list2 = list(list1) self.assertEqual(list2, unicodify(['_THIS', 'THAT', 'SOMEMORE8'])) self.assertEqual(list1, list2) def test_add(self): "addition" list1 = FieldnameList(unicodify(['lower', 'UPPER', 'MiXeD'])) list2 = FieldnameList(['wah', u'a\xf1o']) list3 = FieldnameList(unicodify(['heh', 
'hah'])) # list4 = list1 + list2 self.assertEqual(list1, ['Lower', 'uppeR', 'Mixed']) self.assertEqual(list2, unicodify(['wah', u'A\xf1o'])) self.assertEqual(list4, unicodify(['loWer', 'UPpER', 'mixEd', 'wah', u'a\xf1O'])) self.assertTrue(isinstance(list4, FieldnameList)) # list4 += list3 self.assertEqual(list3, unicodify(['heh', 'hah'])) self.assertEqual(list4, unicodify(['LOWER', 'upper', 'MIxeD', 'wah', u'A\xf1O', 'heh', 'hah'])) self.assertTrue(isinstance(list4, FieldnameList)) # unicode_list = unicodify(['uhhuh', 'UhUh', 'zero']) self.assertEqual(unicode_list, [u'uhhuh', u'UhUh', u'zero']) list5 = unicode_list + list1 self.assertEqual(list1, unicodify(['LoWeR', 'uPpEr', 'MixED'])) self.assertEqual(list5, unicodify(['UhHuh', 'uHuH', 'zero', 'lowER', 'UPPer', 'miXeD'])) self.assertTrue(isinstance(list5, FieldnameList)) def test_append_extend(self): "appending and extending" list1 = FieldnameList(unicodify(['lowER', 'UPPer', 'miXeD'])) list2 = FieldnameList(['wah', u'a\xd1o']) list3 = FieldnameList(unicodify(['heh', 'hah'])) # list1.append('ten') self.assertEqual(list1, ['LOWer', 'uppER', 'MIxEd', 'ten']) list2.extend(unicodify(['prime', 'Maybe'])) self.assertEqual(list2, unicodify(['wah', u'A\xd1o', 'PRIME', 'maybe'])) # list3.extend(list1) self.assertEqual(list1, unicodify(['lower', 'UPPER', 'miXEd', 'ten'])) self.assertEqual(list3, unicodify(['heh', 'hah', 'Lower', 'uPPER', 'MiXEd', 'ten'])) def test_index(self): "indexing" list1 = FieldnameList(unicodify(['lOwEr', 'UpPeR', 'mIXed'])) list2 = FieldnameList(['wah', u'a\xd1O']) list3 = FieldnameList(unicodify(['heh', 'hah'])) # self.assertEqual(list1.index('lower'), 0) self.assertEqual(list2.index(u'A\xd1O'), 1) self.assertRaises(ValueError, list3.index, u'not there') self.assertRaises(ValueError, list3.index, 'not there') # slice1 = list1[:] slice2 = list2[:1] slice3 = list3[1:] self.assertTrue(isinstance(slice1, FieldnameList)) self.assertTrue(isinstance(slice2, FieldnameList)) 
self.assertTrue(isinstance(slice3, FieldnameList)) self.assertEqual(slice1, ['LOWER', 'UPPER', 'MIXED']) self.assertEqual(slice2, unicodify(['WAH'])) self.assertEqual(slice3, unicodify(['HAH'])) def test_sort(self): "sorting" list1 = FieldnameList(unicodify(['LoweR', 'uPPEr', 'MiXED'])) list2 = FieldnameList(['wah', u'A\xd1O']) list3 = FieldnameList(unicodify(['heh', 'hah'])) list1.sort() list2.sort() list3.sort() # self.assertEqual(list1, ['LOWER', 'MIXED', 'UPPER']) self.assertEqual(list2, unicodify([u'A\xD1O', 'WAH'])) self.assertEqual(list3, unicodify(['HAH', 'HEH'])) self.assertFalse(list3 != list3) self.assertFalse(list2 < list2) self.assertFalse(list1 > list1) # list4 = list2[:] list5 = list2[:] + ['bar'] list6 = list2[:] + unicodify(['size']) list4.sort() list5.sort() list6.sort() # self.assertTrue(list2 < list1) self.assertTrue(list2 <= list1) self.assertFalse(list2 == list1) self.assertFalse(list2 >= list1) self.assertFalse(list2 > list1) self.assertTrue(list2 == list4) self.assertTrue(list4 > list5) self.assertTrue(list5 < list6) self.assertTrue(list5 <= list6) self.assertTrue(list5 != list6) self.assertFalse(list5 >= list6) self.assertFalse(list5 > list6) self.assertTrue(list6 > list5) self.assertTrue(list6 < list4) def test_contains(self): list1 = FieldnameList(unicodify(['lower', 'UPPER', 'MiXeD'])) list2 = FieldnameList(['wah', u'a\xf1o']) list3 = FieldnameList(unicodify(['heh', 'hah'])) # self.assertTrue('Mixed' in list1) self.assertFalse(u'a\xf1o' in list1) self.assertTrue(u'A\xf1O' in list2) self.assertFalse('HEH' in list2) self.assertTrue(u'HEH' in list3) self.assertFalse(u'Mixed' in list3) class TestReadWriteDefaultOpen(TestCase): "test __enter__/__exit__" def setUp(self): "create a dbf table" self.dbf_table = table = Table( os.path.join(tempdir, 'temptable'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3' ) table.open(READ_WRITE) table.append(('Rose Petals', True, 115, Date(2018, 2, 14), 'lightly scented, pink & red')) 
table.close() def tearDown(self): os.chmod(self.dbf_table.filename, stat.S_IWRITE|stat.S_IREAD) os.chmod(self.dbf_table._meta.memoname, stat.S_IWRITE|stat.S_IREAD) self.dbf_table.close() def test_context_manager(self): with self.dbf_table as t: t.append(dict(name='Stoneleaf', paid=True, qty=1)) def test_delete_fields(self): dbf.delete_fields(self.dbf_table.filename, 'orderdate') def test_add_fields(self): dbf.add_fields(self.dbf_table.filename, 'alias C(25)') def test_processing(self): for rec in dbf.Process(self.dbf_table): rec.name = 'Carnations' def test_read_only(self): table = self.dbf_table os.chmod(table.filename, stat.S_IREAD) os.chmod(table._meta.memoname, stat.S_IREAD) table.open(READ_ONLY) table.close() self.assertRaises((IOError, OSError), table.open, READ_WRITE) class TestDBC(TestCase): "test DBC handling" class TestVapor(TestCase): "test Vapor objects" def test_falsey(self): self.assertFalse(dbf.Vapor) class TestMisc(TestCase): "miscellaneous tests" def setUp(self): self.table = Table( os.path.join(tempdir, 'dbf_table.'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3', ) self.table_dbf = Table( os.path.join(tempdir, 'dbf_table.dbf'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3', ) self.table_implicit = Table( os.path.join(tempdir, 'dbf_table'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3', ) self.table_weird = Table( os.path.join(tempdir, 'dbf_table.blah'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3', ) self.table.close() self.table_dbf.close() self.table_implicit.close() self.table_weird.close() def test_table_type_with_dbf(self): dbf.table_type(self.table.filename) dbf.table_type(self.table_dbf.filename) dbf.table_type(self.table_implicit.filename) dbf.table_type(self.table_weird.filename) dbf.Table(self.table.filename) dbf.Table(self.table_dbf.filename) dbf.Table(self.table_implicit.filename) dbf.Table(self.table_weird.filename) class 
TestWhatever(TestCase): "move tests here to run one at a time while debugging" def setUp(self): "create a dbf and vfp table" self.dbf_table = table = Table( os.path.join(tempdir, 'temptable'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M', dbf_type='db3' ) table.open(mode=READ_WRITE) namelist = self.dbf_namelist = [] paidlist = self.dbf_paidlist = [] qtylist = self.dbf_qtylist = [] orderlist = self.dbf_orderlist = [] desclist = self.dbf_desclist = [] for i in range(len(floats)): name = '%-25s' % words[i] paid = len(words[i]) % 3 == 0 qty = round(floats[i], 5) orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) namelist.append(name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) table.append({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc}) table.close() self.vfp_table = table = Table( os.path.join(tempdir, 'tempvfp'), 'name C(25); paid L; qty N(11,5); orderdate D; desc M; mass B;' ' weight F(18,3); age I; meeting T; misc G; photo P', dbf_type='vfp', ) table.open(mode=READ_WRITE) namelist = self.vfp_namelist = [] paidlist = self.vfp_paidlist = [] qtylist = self.vfp_qtylist = [] orderlist = self.vfp_orderlist = [] desclist = self.vfp_desclist = [] masslist = self.vfp_masslist = [] weightlist = self.vfp_weightlist = [] agelist = self.vfp_agelist = [] meetlist = self.vfp_meetlist = [] misclist = self.vfp_misclist = [] photolist = self.vfp_photolist = [] for i in range(len(floats)): name = words[i] paid = len(words[i]) % 3 == 0 qty = round(floats[i], 5) orderdate = datetime.date((numbers[i] + 1) * 2, (numbers[i] % 12) +1, (numbers[i] % 27) + 1) desc = ' '.join(words[i:i+50]) mass = floats[i] * floats[i] / 2.0 weight = round(floats[i] * 3, 3) age = numbers[i] meeting = datetime.datetime((numbers[i] + 2000), (numbers[i] % 12)+1, (numbers[i] % 28)+1, \ (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60) misc = ' 
'.join(words[i:i+50:3]).encode('ascii') photo = ' '.join(words[i:i+50:7]).encode('ascii') namelist.append('%-25s' % name) paidlist.append(paid) qtylist.append(qty) orderlist.append(orderdate) desclist.append(desc) masslist.append(mass) weightlist.append(weight) agelist.append(age) meetlist.append(meeting) misclist.append(misc) photolist.append(photo) meeting = datetime.datetime((numbers[i] + 2000), (numbers[i] % 12)+1, (numbers[i] % 28)+1, (numbers[i] % 24), numbers[i] % 60, (numbers[i] * 3) % 60) table.append({'name':name, 'paid':paid, 'qty':qty, 'orderdate':orderdate, 'desc':desc, 'mass':mass, 'weight':weight, 'age':age, 'meeting':meeting, 'misc':misc, 'photo':photo}) table.close() def tearDown(self): self.dbf_table.close() self.vfp_table.close() # main if __name__ == '__main__': tempdir = tempfile.mkdtemp() try: unittest.main() finally: shutil.rmtree(tempdir, True)
# dbf-0.99.10/dbf/utils.py
from __future__ import print_function from glob import glob import codecs import collections import csv import os from . 
import dbf from .constants import * # utility functions def add_fields(table_name, field_specs): """ adds fields to an existing table """ table = dbf.Table(table_name) table.open(dbf.READ_WRITE) try: table.add_fields(field_specs) finally: table.close() def create_template(table_or_record, defaults=None): if isinstance(table_or_record, dbf.Table): return dbf.RecordTemplate(table_or_record._meta, defaults) else: return dbf.RecordTemplate(table_or_record._meta, table_or_record, defaults) def delete(record): """ marks record as deleted """ template = isinstance(record, dbf.RecordTemplate) if not template and record._meta.status == CLOSED: raise DbfError("%s is closed; cannot delete record" % record._meta.filename) record_in_flux = not record._write_to_disk if not template and not record_in_flux: record._start_flux() try: record._data[0] = ASTERISK if not template: record._dirty = True except: if not template and not record_in_flux: record._rollback_flux() raise if not template and not record_in_flux: record._commit_flux() def delete_fields(table_name, field_names): """ deletes fields from an existing table """ table = dbf.Table(table_name) table.open(dbf.READ_WRITE) try: table.delete_fields(field_names) finally: table.close() def ensure_unicode(value): if isinstance(value, bytes): if dbf.input_decoding is None: raise dbf.DbfError('value must be unicode, not bytes (or set input_decoding)') value = value.decode(dbf.input_decoding) return value def export(table_or_records, filename=None, field_names=None, format='csv', header=True, dialect='dbf', encoding=None, ignore_errors=False, strip_nulls=False): """ writes the records using CSV or tab-delimited format, using the filename given if specified, otherwise the table name if table_or_records is a collection of records (not an actual table) they should all be of the same format """ table = source_table(table_or_records[0]) if filename is None: filename = table.filename if field_names is None: field_names = table.field_names 
if isinstance(field_names, basestring): field_names = [f.strip() for f in field_names.split(',')] format = format.lower() if format not in ('csv', 'tab', 'fixed'): raise dbf.DbfError("export format: csv, tab, or fixed -- not %s" % format) if format == 'fixed': format = 'txt' if encoding is None: encoding = table._meta.codepage header_names = field_names base, ext = os.path.splitext(filename) if ext.lower() in ('', '.dbf'): filename = base + "." + format with codecs.open(filename, 'w', encoding=encoding) as fd: if format == 'csv': if header is True: fd.write(','.join(header_names)) fd.write('\n') elif header: fd.write(','.join(header)) fd.write('\n') for record in table_or_records: fields = [] for fieldname in field_names: try: data = record[fieldname] except Exception: if not ignore_errors: raise data = None if isinstance(data, basestring) and data: data = '"%s"' % data.replace('"','""') elif data is None: data = '' data = unicode(data) if strip_nulls: data = data.replace(NULL, '') fields.append(data) fd.write(','.join(fields)) fd.write('\n') elif format == 'tab': if header is True: fd.write('\t'.join(header_names) + '\n') elif header: fd.write('\t'.join(header) + '\n') for record in table_or_records: fields = [] for fieldname in field_names: try: data = record[fieldname] except Exception: if not ignore_errors: raise data = None data = unicode(data) if strip_nulls: data = data.replace(NULL, '') fields.append(data) fd.write('\t'.join(fields) + '\n') else: # format == 'fixed' if header is True: header = False # don't need it elif header: # names to use as field names header = list(header) # in case header is an iterator with codecs.open("%s_layout.txt" % os.path.splitext(filename)[0], 'w', encoding=encoding) as layout: layout.write("%-15s Size Comment\n" % "Field Name") layout.write("%-15s ---- -------------------------\n" % ("-" * 15)) sizes = [] for i, field in enumerate(field_names): info = table.field_info(field) if info.field_type == ord('D'): size = 10 elif 
info.field_type in (ord('T'), ord('@')): size = 19 else: size = info.length sizes.append(size) comment = '' if header and i < len(header): # use overridden field name as comment comment = header[i] layout.write("%-15s %4d %s\n" % (field, size, comment)) layout.write('\nTotal Records in file: %d\n' % len(table_or_records)) for record in table_or_records: fields = [] for i, fieldname in enumerate(field_names): try: data = record[fieldname] except Exception: if not ignore_errors: raise data = None data = unicode(data) if strip_nulls: data = data.replace(NULL, '') fields.append("%-*s" % (sizes[i], data)) fd.write(''.join(fields) + '\n') return len(table_or_records) def field_names(thing): """ fields in table/record, keys in dict if dict and dict keys are not unicode, returned keys will also not be unicode; either way, they will not be upper-cased """ if isinstance(thing, dict): return list(thing.keys()) elif isinstance(thing, (dbf.Table, dbf.Record, dbf.RecordTemplate)): return thing._meta.user_fields[:] elif isinstance(thing, dbf.Index): return thing._table._meta.user_fields[:] else: for record in thing: # grab any record return record._meta.user_fields[:] def first_record(table_name): """ prints the first record of a table """ table = dbf.Table(table_name) table.open() try: print(unicode(table[0])) finally: table.close() def from_csv(csvfile, to_disk=False, filename=None, field_names=None, extra_fields=None, dbf_type='db3', memo_size=64, min_field_size=1, encoding=None, errors=None): """ creates a Character table from a csv file to_disk will create a table with the same name filename will be used if provided field_names default to f0, f1, f2, etc, unless specified (list) extra_fields can be used to add additional fields -- should be normal field specifiers (list) """ with codecs.open(csvfile, 'r', encoding='latin-1', errors=errors) as fd: reader = csv.reader(fd) if field_names: if isinstance(field_names, basestring): field_names = field_names.split() if ' ' not in 
field_names[0]: field_names = ['%s M' % fn for fn in field_names] else: field_names = ['f0 M'] if filename: to_disk = True else: filename = os.path.splitext(csvfile)[0] if to_disk: csv_table = dbf.Table(filename, [field_names[0]], dbf_type=dbf_type, memo_size=memo_size, codepage=encoding) else: csv_table = dbf.Table(':memory:', [field_names[0]], dbf_type=dbf_type, memo_size=memo_size, codepage=encoding, on_disk=False) csv_table.open(dbf.READ_WRITE) fields_so_far = 1 while reader: try: row = next(reader) except UnicodeEncodeError: row = [''] except StopIteration: break while fields_so_far < len(row): if fields_so_far == len(field_names): field_names.append('f%d M' % fields_so_far) csv_table.add_fields(field_names[fields_so_far]) fields_so_far += 1 csv_table.append(tuple(row)) if extra_fields: csv_table.add_fields(extra_fields) csv_table.close() return csv_table def guess_table_type(filename): reported = table_type(filename) possibles = [] version = reported[0] for tabletype in (dbf.Db3Table, dbf.ClpTable, dbf.FpTable, dbf.VfpTable): if version in tabletype._supported_tables: possibles.append((tabletype._versionabbr, tabletype._version, tabletype)) if not possibles: raise dbf.DbfError("Tables of type %s not supported" % unicode(reported)) return possibles def get_fields(table_name): """ returns the list of field names of a table """ table = dbf.Table(table_name) return table.field_names def hex_dump(records): """ just what it says ;) """ for index, dummy in enumerate(records): chars = dummy._data print("%2d: " % (index,)) for char in chars[1:]: print(" %2x " % (char,)) print() def index(sequence): """ returns integers 0 - len(sequence) """ for i in xrange(len(sequence)): yield i def info(table_name): """ prints table info """ table = dbf.Table(table_name) print(unicode(table)) def is_deleted(record): """ marked for deletion? 
""" return record._data[0] == ASTERISK def is_leapyear(year): if year % 400 == 0: return True elif year % 100 == 0: return False elif year % 4 == 0: return True else: return False def recno(record): """ physical record number """ return record._recnum def rename_field(table_name, oldfield, newfield): """ renames a field in a table """ table = dbf.Table(table_name) try: table.rename_field(oldfield, newfield) finally: table.close() def reset(record, keep_fields=None): """ sets record's fields back to blank values, except for fields in keep_fields """ template = record_in_flux = False if isinstance(record, dbf.RecordTemplate): template = True else: record_in_flux = not record._write_to_disk if record._meta.status == CLOSED: raise dbf.DbfError("%s is closed; cannot modify record" % record._meta.filename) if keep_fields is None: keep_fields = [] keep = {} for field in keep_fields: keep[field] = record[field] record._data[:] = record._meta.blankrecord[:] for field in keep_fields: record[field] = keep[field] if not template: if record_in_flux: record._dirty = True else: record._write() def source_table(thingie): """ table associated with table | record | index """ table = thingie._meta.table() if table is None: raise dbf.DbfError("table is no longer available") return table def string(text): if isinstance(text, unicode): return text elif isinstance(text, bytes): return text.decode(dbf.input_decoding) def structure(table_name, field=None): """ returns the definition of a field (or all fields) """ table = dbf.Table(table_name) return table.structure(field) def table_type(filename): """ returns text representation of a table's dbf version """ actual_filename = None search_name = None base, ext = os.path.splitext(filename) if ext == '.': # use filename without the '.' 
search_name = base matches = glob(search_name) elif ext.lower() == '.dbf': # use filename as-is search_name = filename matches = glob(search_name) else: search_name = base + '.[Dd][Bb][Ff]' matches = glob(search_name) if not matches: # back to original name search_name = filename matches = glob(search_name) if len(matches) == 1: actual_filename = matches[0] elif matches: raise dbf.DbfError("please specify exactly which of %r you want" % (matches, )) else: raise dbf.DbfError('File %r not found' % search_name) fd = open(actual_filename, 'rb') version = ord(fd.read(1)) fd.close() fd = None if not version in dbf.tables.version_map: raise dbf.DbfError("Unknown dbf type: %s (%x)" % (version, version)) return version, dbf.tables.version_map[version] def undelete(record): """ marks record as active """ template = isinstance(record, dbf.RecordTemplate) if not template and record._meta.status == CLOSED: raise dbf.DbfError("%s is closed; cannot undelete record" % record._meta.filename) record_in_flux = not record._write_to_disk if not template and not record_in_flux: record._start_flux() try: record._data[0] = SPACE if not template: record._dirty = True except: if not template and not record_in_flux: record._rollback_flux() raise if not template and not record_in_flux: record._commit_flux() def write(record, **kwargs): """ write record data to disk (updates indices) """ if record._meta.status == CLOSED: raise dbf.DbfError("%s is closed; cannot update record" % record._meta.filename) elif not record._write_to_disk: raise dbf.DbfError("unable to use .write_record() while record is in flux") if kwargs: gather(record, kwargs) if record._dirty: record._write() def Process(records, start=0, stop=None, filter=None): """commits each record to disk before returning the next one; undoes all changes to that record if exception raised if records is a table, it will be opened and closed if necessary filter function should return True to skip record, False to keep""" already_open = True if 
isinstance(records, dbf.Table): already_open = records.status != CLOSED if not already_open: records.open(dbf.READ_WRITE) try: if stop is None: stop = len(records) for record in records[start:stop]: if filter is not None and filter(record): continue try: record._start_flux() yield record except: record._rollback_flux() raise else: record._commit_flux() finally: if not already_open: records.close() def Templates(records, start=0, stop=None, filter=None): """ returns a template of each record instead of the record itself if records is a table, it will be opened and closed if necessary """ already_open = True if isinstance(records, dbf.Table): already_open = records.status != CLOSED if not already_open: records.open() try: if stop is None: stop = len(records) for record in records[start:stop]: if filter is not None and filter(record): continue yield(create_template(record)) finally: if not already_open: records.close() # Foxpro functions def gather(record, data, drop=False): """ saves data into a record's fields; writes to disk if not in flux keys with no matching field will raise a FieldMissingError exception unless drop_missing == True; if an Exception occurs the record is restored before reraising """ if isinstance(record, dbf.Record) and record._meta.status == CLOSED: raise dbf.DbfError("%s is closed; cannot modify record" % record._meta.filename) record_in_flux = not record._write_to_disk if not record_in_flux: record._start_flux() try: record_fields = field_names(record) for key in field_names(data): value = data[key] key = ensure_unicode(key).upper() if not key in record_fields: if drop: continue raise dbf.FieldMissingError(key) record[key] = value except: if not record_in_flux: record._rollback_flux() raise if not record_in_flux: record._commit_flux() def scan(table, direction='forward', filter=lambda rec: True): """ moves record pointer forward 1; returns False if Eof/Bof reached table must be derived from _Navigation or have skip() method """ if direction 
not in ('forward', 'reverse'): raise TypeError("direction should be 'forward' or 'reverse', not %r" % direction) if direction == 'forward': n = +1 no_more_records = dbf.Eof else: n = -1 no_more_records = dbf.Bof try: while True: table.skip(n) if filter(table.current_record): return True except no_more_records: return False def scatter(record, as_type=create_template, _mappings=getattr(collections, 'Mapping', dict)): """ returns as_type() of [fieldnames and] values. """ if isinstance(as_type, type) and issubclass(as_type, _mappings): return as_type(zip(field_names(record), record)) else: return as_type(record) ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1742921762.6041298 dbf-0.99.10/dbf.egg-info/0000775000175000017500000000000014770560043013702 5ustar00ethanethan././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921762.0 dbf-0.99.10/dbf.egg-info/PKG-INFO0000664000175000017500000000300014770560042014767 0ustar00ethanethanMetadata-Version: 2.1 Name: dbf Version: 0.99.10 Summary: Pure python package for reading/writing dBase, FoxPro, and Visual FoxPro .dbf files (including memos) Home-page: https://github.com/ethanfurman/dbf Author: Ethan Furman Author-email: ethan@stoneleaf.us License: BSD License Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: BSD License Classifier: Programming Language :: Python Classifier: Topic :: Database Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3.8 Classifier: Programming Language :: Python :: 3.9 Classifier: Programming Language :: Python :: 3.10 Classifier: Programming Language 
:: Python :: 3.11 Classifier: Programming Language :: Python :: 3.12 Classifier: Programming Language :: Python :: 3.13 Provides: dbf Currently supports dBase III, Clipper, FoxPro, and Visual FoxPro tables. Text is returned as unicode, and codepage settings in tables are honored. Memos and Null fields are supported. Documentation needs work, but author is very responsive to e-mails. Not supported: index files (but can create tempory non-file indexes), auto-incrementing fields, and Varchar fields. Installation: `pip install dbf` ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921762.0 dbf-0.99.10/dbf.egg-info/SOURCES.txt0000664000175000017500000000052514770560042015567 0ustar00ethanethansetup.py dbf/LICENSE dbf/README.md dbf/WHATSNEW dbf/__init__.py dbf/_index.py dbf/bridge.py dbf/constants.py dbf/data_types.py dbf/exceptions.py dbf/index.py dbf/pql.py dbf/tables.py dbf/test.py dbf/utils.py dbf.egg-info/PKG-INFO dbf.egg-info/SOURCES.txt dbf.egg-info/dependency_links.txt dbf.egg-info/requires.txt dbf.egg-info/top_level.txt././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921762.0 dbf-0.99.10/dbf.egg-info/dependency_links.txt0000664000175000017500000000000114770560042017747 0ustar00ethanethan ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921762.0 dbf-0.99.10/dbf.egg-info/requires.txt0000664000175000017500000000000614770560042016275 0ustar00ethanethanaenum ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 mtime=1742921762.0 dbf-0.99.10/dbf.egg-info/top_level.txt0000664000175000017500000000000414770560042016425 0ustar00ethanethandbf ././@PaxHeader0000000000000000000000000000003400000000000010212 xustar0028 mtime=1742921762.6041298 dbf-0.99.10/setup.cfg0000664000175000017500000000004614770560043013276 0ustar00ethanethan[egg_info] tag_build = tag_date = 0 ././@PaxHeader0000000000000000000000000000002600000000000010213 xustar0022 
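The `table_type()` helper in dbf/utils.py above identifies a table by reading its very first byte, which encodes the dbf version, and then looking it up in `dbf.tables.version_map`. A minimal standalone sketch of that sniffing step, independent of the dbf package — note that `VERSION_MAP` here is a small illustrative subset and `sniff_version` a hypothetical name, not the package's API:

```python
import os
import tempfile

# Illustrative subset of well-known dbf version bytes (the real package
# uses its full dbf.tables.version_map).
VERSION_MAP = {
    0x03: 'dBase III Plus',
    0x30: 'Visual FoxPro',
    0xf5: 'FoxPro with memos',
}

def sniff_version(filename):
    # a dbf file's first byte encodes its version/type
    with open(filename, 'rb') as fd:
        version = ord(fd.read(1))
    if version not in VERSION_MAP:
        raise ValueError("Unknown dbf type: %s (%x)" % (version, version))
    return version, VERSION_MAP[version]

# quick demonstration with a fake one-byte "table"
tmp = tempfile.NamedTemporaryFile(delete=False, suffix='.dbf')
tmp.write(b'\x03' + b'\x00' * 31)   # version byte plus padding
tmp.close()
print(sniff_version(tmp.name))      # (3, 'dBase III Plus')
os.unlink(tmp.name)
```

The real `table_type()` additionally globs for the `.dbf` extension case-insensitively before opening the file.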
dbf-0.99.10/setup.py:
try:
    import setuptools
    setuptools
except ImportError:
    pass
from distutils.core import setup

#html_docs = glob('dbf/html/*')

long_desc="""
Currently supports dBase III, Clipper, FoxPro, and Visual FoxPro tables.
Text is returned as unicode, and codepage settings in tables are honored.
Memos and Null fields are supported.

Documentation needs work, but the author is very responsive to e-mails.

Not supported: index files (but can create temporary non-file indexes),
auto-incrementing fields, and Varchar fields.

Installation:  `pip install dbf`
"""

py2_only = ()
py3_only = ()
make = []

data = dict(
        name='dbf',
        version='0.99.10',
        license='BSD License',
        description='Pure python package for reading/writing dBase, FoxPro, and Visual FoxPro .dbf files (including memos)',
        long_description=long_desc,
        url='https://github.com/ethanfurman/dbf',
        packages=['dbf', ],
        package_data={
            'dbf' : [
                'LICENSE',
                'README.md',
                'WHATSNEW',
                ]
            },
        provides=['dbf'],
        install_requires=['aenum'],
        author='Ethan Furman',
        author_email='ethan@stoneleaf.us',
        classifiers=[
            'Development Status :: 4 - Beta',
            'Intended Audience :: Developers',
            'License :: OSI Approved :: BSD License',
            'Programming Language :: Python',
            'Topic :: Database',
            'Programming Language :: Python :: 2.7',
            'Programming Language :: Python :: 3.3',
            'Programming Language :: Python :: 3.4',
            'Programming Language :: Python :: 3.5',
            'Programming Language :: Python :: 3.6',
            'Programming Language :: Python :: 3.7',
            'Programming Language :: Python :: 3.8',
            'Programming Language :: Python :: 3.9',
            'Programming Language :: Python :: 3.10',
            'Programming Language :: Python :: 3.11',
            'Programming Language :: Python :: 3.12',
            'Programming Language :: Python :: 3.13',
            ],
        )

if __name__ == '__main__':
    setup(**data)
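The `is_leapyear()` helper in dbf/utils.py above encodes the Gregorian century rule (divisible by 400 leaps, by 100 does not, by 4 does). Since it has no dependencies on the rest of the package, it can be lifted verbatim and cross-checked against the standard library's `calendar.isleap`:

```python
import calendar

def is_leapyear(year):
    # divisible by 400 -> leap; by 100 -> common; by 4 -> leap; else common
    if year % 400 == 0:
        return True
    elif year % 100 == 0:
        return False
    elif year % 4 == 0:
        return True
    else:
        return False

# 1900 and 2100 are the interesting cases: divisible by 4 but not leap years
for year in (1900, 2000, 2023, 2024, 2100):
    assert is_leapyear(year) == calendar.isleap(year)
print("leap-year rule matches calendar.isleap")
```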