Curious as to why this was necessary #38

Open

musashiXXX opened this issue Nov 5, 2018 · 4 comments
musashiXXX commented Nov 5, 2018

elif "NUMERIC" in base_classes\
or "FLOAT" in base_classes\
or "DECIMAL" in base_classes:
####################################
# Check all cleaned_rows to determine
# if column is decimal or integer
####################################
mantissa_max_digits = 0
left_hand_max_digits = 0
mantissa_gt_zero = False
intCount = 0
maxDigit = 0
type_count = {}
types = set([])
for row in raw_rows:
data = row[
self.current_ordered_table_columns.index(
column.name)]
types.add(data.__class__.__name__)
if type_count.get(data.__class__.__name__):
type_count[data.__class__.__name__] += 1
else:
type_count[data.__class__.__name__] = 1
######################
# Check for NULL data
# (We will drop column if all rows contain null)
######################
if data is not None:
null = False
if data.__class__.__name__ == 'Decimal' or\
data.__class__.__name__ == 'float':
splt = str(data).split(".")
if len(splt) == 1:
intCount += 1
maxDigit = max(data, maxDigit)
continue
left_hand_digits = splt[0]
mantissa_digits = splt[1]
# Store greatest mantissa to check for decimal cols that
# should be integers...(i.e. if m = 3.000)
mantissa_max_digits = max(mantissa_max_digits,
len(mantissa_digits))
left_hand_max_digits = max(left_hand_max_digits,
len(left_hand_digits))
# If we have a mantissa greater than zero, we can keep this column as a decimal
if not mantissa_gt_zero and float(mantissa_digits) > 0:
# Short circuit the above 'and' so we don't keep resetting mantissa_gt_zero
mantissa_gt_zero = True
elif data.__class__.__name__ == 'int':
intCount += 1
maxDigit = max(data, maxDigit)
self.logger.info(" --> " + str(column.name) +
"..." + str(type_count))
#self.logger.info("Max Digit Length: {0}".format(str(len(str(maxDigit)))))
#self.logger.info("Max Mantissa Digits: {0}".format(str(mantissa_max_digits)))
#self.logger.info("Max Left Hand Digit: {0}".format(str(left_hand_max_digits)))
#self.logger.info("Total Left Max Digits: {0}".format(str(max(len(str(maxDigit)), left_hand_max_digits))))
if mantissa_gt_zero:
cum_max_left_digits = max(
len(str(maxDigit)), (left_hand_max_digits))
self.logger.info("Numeric({0}, {1})".format(str(cum_max_left_digits + mantissa_max_digits), str(mantissa_max_digits)))
column_copy.type = Numeric(
precision=cum_max_left_digits + mantissa_max_digits,
scale=mantissa_max_digits)
if intCount > 0:
self.logger.warning(
"Column '" +
column.name +
"' contains decimals and integers, " +
"resorting to type 'Numeric'")
if column.primary_key:
self.logger.warning(
"Column '" +
column.name +
"' is a primary key, but is of type 'Decimal'")
else:
self.logger.warning(
"Column '" +
column.name +
"' is of type 'Decimal', but contains no mantissas " +
"> 0. (i.e. 3.00, 2.00, etc..)\n ")
if maxDigit > 4294967295:
self.logger.warning("Coercing to 'BigInteger'")
column_copy.type = BigInteger()
# Do conversion...
for r in raw_rows:
if r[idx] is not None:
r[idx] = long(r[idx])
else:
column_copy.type = Integer()
self.logger.warning("Coercing to 'Integer'")
for r in raw_rows:
if r[idx] is not None:
r[idx] = int(r[idx])

I'm curious about why we'd want to convert all numbers with a decimal point (decimals, floats, etc.) to integers/bigintegers (i.e., dropping the decimal). Leaving this code in place appears to cause errors like this:

psycopg2.DataError: value "3710090300" is out of range for type integer

I've successfully migrated a ~40+ GB MSSQL database to PostgreSQL with this behavior effectively disabled, but I'd still like to know why it was necessary in case I'm missing something.
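If I'm reading the coercion branch right, part of the problem is that the cutoff of 4294967295 is the unsigned 32-bit maximum, while PostgreSQL's INTEGER is a signed 4-byte type (max 2147483647). A minimal sketch of the boundary, with constant names of my own invention:

PG_INT_MAX = 2**31 - 1         # 2147483647: PostgreSQL's signed 4-byte INTEGER max
COERCE_CUTOFF = 4294967295     # 2**32 - 1: the threshold used in the code above

value = 3710090300             # the value from the traceback above
print(value <= COERCE_CUTOFF)  # True  -> the code picks Integer(), not BigInteger()
print(value <= PG_INT_MAX)     # False -> psycopg2 rejects the insert

So any whole-valued column whose maximum falls between 2147483648 and 4294967295 gets coerced to Integer and then fails on insert into PostgreSQL.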

Thanks!

tpow (Contributor) commented Dec 18, 2018

Curious is one word for it. I find this unexpected and generally undesirable behavior. Even if all the values in a column happen to be whole numbers, I do not necessarily want the data type changed to eliminate the decimal. For example, a price column that just happens to hold whole dollar amounts should not be changed. This should be optional behavior, not the default. To me, it is similar to the compress_varchar option: altering the varchar size is not the default, and changing decimal to integer should not be the default either.

Worse, if I'm following the code correctly, this decimal assessment behavior (possibly connected with the option to drop empty columns) means that the schema migration reads all the data. That should not need to be the case.

seanharr11 (Owner) commented

Hi all - this was open-sourced in haste as an MVP that I had hoped others could benefit from. The original use case was migrating a small-ish Oracle 9i database into MySQL, and the state of that initial DB was a mess. One of the issues was that the database didn't use INTEGER columns anywhere, despite there being well over 100 columns that should have been INTEGERs. So I decided to scan every value in each DECIMAL/NUMERIC column, and if all of them could safely be coerced to an INTEGER, I did so.

You may disagree with this behavior, and that is fine...it should probably be an optional kwarg. That said, it should only affect your migration if you have a Decimal column with ALL 0's in ALL decimal places in the DB.
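Concretely, the scan above only coerces a column when every value splits to an all-zero mantissa. A standalone sketch of that check (a hypothetical helper, not the library's actual code):

def all_whole(values):
    # True only when every value has zeros in all decimal places (3.00, 2.0, ...)
    return all(float(str(v).split(".")[1]) == 0 if "." in str(v) else True
               for v in values)

print(all_whole([3.00, 2.00, 7.0]))  # True  -> the column gets coerced to an integer type
print(all_whole([3.00, 2.50]))       # False -> the column stays Numeric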

This library needs a re-write now that it has some users/followers. The bigger concerns are establishing proper test coverage and addressing scalability, but all of these small things need to be cleaned up as well.

musashiXXX (Author) commented Dec 19, 2018

First, thank you. I've managed to get further along in my own work because of this project. My scenario involves migrating a ~40 GB Microsoft SQL Server database to PostgreSQL... as you can imagine, this is not a simple task.

You may disagree with this behavior, and that is fine...

It's not so much a disagreement (for me at least). I figured there was some reason behind it; having the explanation is helpful and I'm sure it's a situation most of us can relate to. That said:

it should probably be an optional kwarg

I actually have an implementation of this on my local copy but it's not clean enough (yet) for a PR.
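Roughly, it just gates the NUMERIC/FLOAT/DECIMAL scan behind a flag. Something along these lines, where the kwarg name is mine and not final:

from etlalchemy import ETLAlchemySource

# Hypothetical shape only; 'skip_numeric_downcast' is a placeholder name,
# not an existing option in the library.
src = ETLAlchemySource(
    "mssql+pymssql://user:password@host/mydb",
    skip_numeric_downcast=True,  # leave DECIMAL/NUMERIC columns as Numeric
)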

This library needs a re-write now that it has some users/followers. The bigger concerns are establishing proper test coverage and addressing scalability, but all of these small things need to be cleaned up as well.

I'm more than happy to contribute as much as I can. Thanks again for the work you've put into this project.

EDIT: I forgot to add:

it should only affect your migration if you have a Decimal column with ALL 0's in ALL decimal places in the DB.

My source database contains a lot of columns with whole dollar amounts. There are also some columns that need to be BIGINT (integer is not large enough) but they end up getting coerced to INT. Upon insertion into the destination database, we get an error. I have some more details in my notes but that's the gist.

tpow (Contributor) commented Dec 19, 2018

I apologize; I neglected to say a big thank you for the code. I appreciate your kindness in sharing it. Despite this curious behavior, it has overall been quite helpful. It makes sense that this was a side-effect of cleaning up a messy database.

I'm working on various fixes and test cases.

Similar to @musashiXXX, I had a problem with a decimal column where every row held a whole value (zero in all decimal places); it was coerced to INT. Arguably that didn't affect anything in my data at the time of migration, but it was unexpected and could have become a problem later if I hadn't noticed it.
