We are currently experiencing the same issue while inserting large strings (between 2000-4000 characters) into the Oracle database, and it looks like a few tickets opened for it have been closed as duplicates of this one. The NLS_NCHAR_CHARACTERSET on our database is AL16UTF16, which uses 2 bytes per character. From digging in a little deeper, it looks like a Python string is mapped to cx_Oracle.STRING, which is then mapped to VARCHAR, NVARCHAR, or LONG in Oracle, and the conversion to LONG for long values is what causes the error. The issue with strings over 4000 characters was apparently fixed by setting the input size to cx_Oracle.CLOB once that character limit is reached; lowering the threshold to 2000 (to account for UTF-16) works and solves the problem for us. Would setting the comparison value to 1000 (taking other encoding formats into consideration) before switching to CLOB be the right fix for this issue?
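For reference, this is roughly the workaround we are using at the cx_Oracle level. It is only a minimal sketch, not the actual backend patch: the table name `demo_table`, column/bind name `text_col`, and the `CLOB_THRESHOLD` constant are hypothetical, and the threshold value would need to match the database character set as discussed above.

```python
import cx_Oracle

# Assumed threshold (in characters) above which a string parameter is bound
# as a CLOB. 2000 is the safe limit for AL16UTF16 (2 bytes per character
# against the 4000-byte VARCHAR2/NVARCHAR2 cap); 1000 would also cover
# 4-byte-per-character encodings.
CLOB_THRESHOLD = 2000


def insert_text(connection, value):
    """Insert one text value, binding it as a CLOB when it is long enough
    to overflow the VARCHAR2 byte limit (hypothetical table/column names)."""
    cursor = connection.cursor()
    try:
        if isinstance(value, str) and len(value) > CLOB_THRESHOLD:
            # Tell cx_Oracle to bind this parameter as a CLOB instead of
            # letting the value fall back to a LONG conversion.
            cursor.setinputsizes(text_col=cx_Oracle.CLOB)
        cursor.execute(
            "INSERT INTO demo_table (text_col) VALUES (:text_col)",
            text_col=value,
        )
        connection.commit()
    finally:
        cursor.close()
```

With this in place, short strings keep the default STRING binding while anything over the threshold is sent as a CLOB, which avoids the LONG conversion error we were seeing.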
Environment: Oracle 10g, Django 1.7.7, cx_Oracle 5.2