mysql Data truncated for column 'x' at row y warning

Hello,

When storing a string that is longer than the table definition allows, is it possible to configure MySQL to treat this condition as an error and abort the operation, rather than blindly truncating/corrupting the data and reporting a warning? Or does this require using transactions[1]? I thought I saw a solution somewhere ages ago, but can't find it right now.

Thanks

Notes:
[1] IIRC transaction support requires InnoDB and some applications require MyISAM - e.g. for geospatial stuff. Of course another option would be to switch to a better DB, e.g. PostgreSQL.

--
Brian May <brian@microcomaustralia.com.au>

Hi Brian
When storing a string that is longer than the table definition allows, is it possible to configure MySQL to treat this condition as an error and abort the operation, rather than blindly truncating/corrupting the data and reporting a warning? Or does this require using transactions[1]?
sql_mode=TRADITIONAL sets a whole set of options that will do what you want. See ref manual for more detail.

How effective it is does depend on the underlying storage engine's ability to roll back, regardless of whether you use an explicit transaction. That is, it'll be ok on any operation affecting only a single row; otherwise use InnoDB or another transactional engine. That should be the default anyway for most production environments. MyISAM has specific capabilities and uses, and can be used when appropriate (it was architected for data warehousing, is fast at bulk inserts concurrently with selects, and has particularly fast table scans).
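As a rough illustration of the difference (a sketch only - the table, column size and connection details below are invented, and it assumes the MySQLdb/mysqlclient driver):

    import MySQLdb

    conn = MySQLdb.connect(host='localhost', user='test', passwd='test', db='test')
    cur = conn.cursor()
    cur.execute("SET SESSION sql_mode = 'TRADITIONAL'")
    cur.execute("CREATE TABLE IF NOT EXISTS t (x VARCHAR(5)) ENGINE=InnoDB")
    try:
        # With TRADITIONAL (strict) mode the over-long value is rejected
        # outright instead of being silently truncated with a warning.
        cur.execute("INSERT INTO t (x) VALUES (%s)", ("far too long for x",))
        conn.commit()
    except MySQLdb.Error as e:
        conn.rollback()
        print(e)   # e.g. (1406, "Data too long for column 'x' at row 1")

Without the SET SESSION line (or an equivalent server-wide setting), the same INSERT succeeds with a truncated value and only raises a warning.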
[1] IIRC transaction support requires InnoDB and some applications require MyISAM - e.g. for geospatial stuff. Of course another option would be to switch to a better DB, e.g. PostgreSQL.
If an app requires extensive GIS support, I definitely recommend PostgreSQL as it's particularly suited to that task. I see no benefit in a discussion like "is green better than blue" - they are both valid and useful colours.

Regards,
Arjen.
--
Exec.Director @ Open Query (http://openquery.com) - MySQL services
Sane business strategy explorations at http://Upstarta.biz
Personal blog at http://lentz.com.au/blog/

On 5 December 2011 21:06, Arjen Lentz <arjen@lentz.com.au> wrote:
When storing a string that is longer than the table definition allows, is it possible to configure MySQL to treat this condition as an error and abort the operation, rather than blindly truncating/corrupting the data and reporting a warning? Or does this require using transactions[1]?
sql_mode=TRADITIONAL sets a whole set of options that will do what you want. See ref manual for more detail.
Ok, so that needs to be set for every session, right? i.e. in the client code (I use django).
If an app requires extensive GIS support, I definitely recommend PostgreSQL as it's particularly suited to that task. I see no benefit in a discussion like "is green better than blue" - they are both valid and useful colours.
I don't particularly trust the Django MySQL code; I suspect they didn't really know what they were doing. E.g. this code is called very, very frequently:

    def _valid_connection(self):
        if self.connection is not None:
            try:
                self.connection.ping()
                return True
            except DatabaseError:
                self.connection.close()
                self.connection = None
        return False

I seem to recall from previous research some time ago that they did this as a workaround in case the server "unexpectedly" closed the connection. As a result, there are many more round trips to the database than needed.

My gut feeling is there has to be a better way of resolving any issues than sending a ping packet before every request. Of course I couldn't find the discussion with a quick Google search, will try again later.

--
Brian May <brian@microcomaustralia.com.au>

Hi Brian

----- Original Message -----
On 5 December 2011 21:06, Arjen Lentz <arjen@lentz.com.au> wrote:
When storing a string that is longer than the table definition allows, is it possible to configure MySQL to treat this condition as an error and abort the operation, rather than blindly truncating/corrupting the data and reporting a warning? Or does this require using transactions[1]?
sql_mode=TRADITIONAL sets a whole set of options that will do what you want. See ref manual for more detail.
Ok, so that needs to be set for every session, right? i.e. in the client code (I use django).
You can put that line (perhaps you have to double-quote the TRADITIONAL word, see refman) in my.cnf in the [mysqld] block, then it's global.
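For the Django side of the question: rather than setting it per session in application code, Django can also send the statement when each new connection is opened, via the OPTIONS dict that the MySQL backend passes through to MySQLdb.connect(). A sketch (the database name and credentials are placeholders):

    # settings.py
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'mydb',
            'USER': 'myuser',
            'PASSWORD': 'secret',
            'OPTIONS': {
                # Runs once on every new connection the backend opens.
                'init_command': "SET SESSION sql_mode = 'TRADITIONAL'",
            },
        },
    }

Setting it globally in my.cnf as described above is still the more thorough option, since it also covers other clients talking to the same server.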
If an app requires extensive GIS support, I definitely recommend PostgreSQL as it's particularly suited to that task. I see no benefit in a discussion like "is green better than blue" - they are both valid and useful colours.
I don't particularly trust the Django MySQL code; I suspect they didn't really know what they were doing. E.g. this code is called very, very frequently:
    def _valid_connection(self):
        if self.connection is not None:
            try:
                self.connection.ping()
                return True
            except DatabaseError:
                self.connection.close()
                self.connection = None
        return False
I seem to recall from previous research some time ago that they did this as a workaround in case the server "unexpectedly" closed the connection. As a result, there are many more round trips to the database than needed.
My gut feeling is there has to be a better way of resolving any issues than sending a ping packet before every request.
Ya, they didn't read/follow the client library info. As you say, unnecessary overhead.

The proper way is as follows: if the connection was lost, a query call will return a specific error, and the client library will reconnect. The app then just needs to re-issue the same query - or, in case it was in the middle of a transaction, restart that transaction (obviously).

Cheers,
Arjen.
--
Exec.Director @ Open Query (http://openquery.com) - MySQL services
Sane business strategy explorations at http://Upstarta.biz
Personal blog at http://lentz.com.au/blog/
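A rough sketch of that reconnect-and-retry approach with the MySQLdb driver (the wrapper class and its names are invented here; 2006 and 2013 are the client library's "server has gone away" / "lost connection" error codes):

    import MySQLdb

    # Client-side errors indicating the connection was dropped.
    RECONNECT_ERRORS = (2006, 2013)   # server has gone away / lost connection

    class RetryingConnection(object):
        """Reconnect and re-issue a query once if the server dropped the
        connection, instead of pinging before every single query."""

        def __init__(self, **connect_args):
            self.connect_args = connect_args
            self.connection = MySQLdb.connect(**connect_args)

        def execute(self, sql, params=None):
            try:
                cursor = self.connection.cursor()
                cursor.execute(sql, params)
                return cursor
            except MySQLdb.OperationalError as e:
                if e.args[0] not in RECONNECT_ERRORS:
                    raise
                # Connection was dropped (e.g. by wait_timeout): reconnect
                # and re-issue the same query.  A transaction that was in
                # progress has to be restarted by the caller.
                self.connection = MySQLdb.connect(**self.connect_args)
                cursor = self.connection.cursor()
                cursor.execute(sql, params)
                return cursor

Used as db = RetryingConnection(host='localhost', user='test', passwd='test', db='test'), a call like db.execute("SELECT 1") behaves like a normal cursor but survives an idle connection being closed, without the extra ping round trip per query.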