SPARKC-516: Add column name to the error message in case of conversion error #1155
base: master
Conversation
buffer(i) = converters(i).convert(colValue)
buffer(i) = Try(converters(i).convert(colValue)) match {
  case Success(value) => value
  case Failure(ex: IllegalArgumentException) =>
Most of the exceptions I've seen that were hard to track down were either IllegalArgumentException or one of its subclasses.
case Failure(ex: IllegalArgumentException) =>
  val columnType = columnTypes(i).scalaTypeName
  throw new TypeConversionException(
    s"Cannot convert value $colValue of column ${columnNames(i)} to type $columnType", ex)
The message will look like `Cannot convert value somevalue of column somecolumn to type Int`, without any wrapping of `somevalue` (such as single or double quotes). Any suggestions on how to improve readability?
We'll probably want to do some perf testing around this change, since it adds a try/catch into a pretty tight loop.
Would it be ok to change it from using
Mostly I was worried about the try/catch block in the very inner loop of the conversion here. Sorry I've been so long in responding. I do think that for fixing this issue, maybe we should instead force Spark to do the conversion, i.e. not accept a String for a numeric type, or something like that, so that Spark performs the conversion rather than our type converter code. Gotta think more on this.
The change wraps the actual conversion exception in a new exception carrying debug information about the column name and the expected type.
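To illustrate the trade-off raised above, here is a minimal, self-contained sketch of the same idea using a plain try/catch instead of allocating a `Try` per value, which would avoid the extra object allocation in the tight conversion loop. The names (`converters`, `columnNames`, `columnTypeNames`, `TypeConversionException`) mirror the diff but are hypothetical stand-ins, not the actual connector code; it also quotes the offending value, as suggested in the readability comment.

```scala
// Sketch only: mimics the connector's per-column conversion loop with
// stand-in converters. NumberFormatException (thrown by toInt) is a
// subclass of IllegalArgumentException, so it is caught here too.
class TypeConversionException(message: String, cause: Throwable)
  extends RuntimeException(message, cause)

object ConversionSketch {
  // Hypothetical column metadata and converters for a two-column row.
  val columnNames: Array[String] = Array("id", "count")
  val columnTypeNames: Array[String] = Array("Int", "Int")
  val converters: Array[Any => Any] =
    Array(v => v.toString.toInt, v => v.toString.toInt)

  def convertRow(row: Array[Any]): Array[Any] = {
    val buffer = new Array[Any](row.length)
    var i = 0
    while (i < row.length) {
      try {
        // Plain try/catch: no Try/Success/Failure allocation per value.
        buffer(i) = converters(i)(row(i))
      } catch {
        case ex: IllegalArgumentException =>
          // Re-throw with column context; the value is quoted for readability.
          throw new TypeConversionException(
            s"Cannot convert value '${row(i)}' of column ${columnNames(i)} " +
              s"to type ${columnTypeNames(i)}", ex)
      }
      i += 1
    }
    buffer
  }
}
```

On a successful row, `convertRow(Array("1", "2"))` yields `Array(1, 2)`; on a bad value, the thrown message names the column, e.g. `Cannot convert value 'x' of column id to type Int`. Whether the try/catch here measurably differs from the `Try`-based version would still need the perf testing mentioned above.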