Changes between Initial Version and Version 1 of Ticket #11487, comment 39


Timestamp:
Aug 12, 2015, 5:11:08 PM
Author:
RohitV24


initial:
We are currently experiencing the same issue while inserting large strings (between 2000 and 4000 characters in size) into the Oracle database. The NLS_NCHAR_CHARACTERSET on our database is AL16UTF16, which assigns 2 bytes per character. From digging in a little deeper, it looks like a string is mapped to cx_Oracle.STRING, which is then mapped to either a VARCHAR, NVARCHAR, or LONG in Oracle, and the conversion to LONG for long values is causing the error. It looks like the issue with 4000 characters was fixed by setting the input size to cx_Oracle.CLOB once the value reached the character limit. Using 2000 (for UTF-16) seems to work fine and solve the problem. Would setting the comparison value to 1000 (taking other encoding formats into consideration) before switching to CLOB be the fix for this issue?

v1:
We are currently experiencing the same issue while inserting large strings (between 2000 and 4000 characters) into the Oracle database, and it looks like a few tickets that were opened have been closed as duplicates of this one. The NLS_NCHAR_CHARACTERSET on our database is AL16UTF16, which assigns 2 bytes per character. From digging in a little deeper, it looks like a string is mapped to cx_Oracle.STRING, which is then mapped to either a VARCHAR, NVARCHAR, or LONG in Oracle, and the conversion to LONG for long values is causing the error. It looks like the issue with 4000 characters was fixed by setting the input size to cx_Oracle.CLOB once the value reached the character limit. Using 2000 (for UTF-16) seems to work fine and solve the problem. Would setting the comparison value to 1000 (taking other encoding formats into consideration) before switching to CLOB be the fix for this issue?

Environment: Oracle 10g, Django 1.7.7, cx_Oracle 5.2
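The threshold logic the comment proposes can be sketched as a small helper. This is a hypothetical illustration, not Django's actual Oracle backend code; the function name `needs_clob_binding` and the 4-bytes-per-character worst case are assumptions made for the example.

```python
# Hypothetical sketch of the comparison discussed above: decide when a string
# parameter should be bound as a CLOB instead of a VARCHAR-family type,
# because its encoded size could exceed Oracle's 4000-byte limit.

ORACLE_VARCHAR_MAX_BYTES = 4000  # VARCHAR2/NVARCHAR2 byte limit in Oracle 10g
MAX_BYTES_PER_CHAR = 4           # worst-case bytes per character across encodings

def needs_clob_binding(value: str) -> bool:
    """Return True if the string might not fit in a VARCHAR-family column.

    With AL16UTF16 (2 bytes per character) the safe cutoff is 2000
    characters; budgeting 4 bytes per character, as the comment suggests,
    lowers the cutoff to 1000 characters.
    """
    return len(value) > ORACLE_VARCHAR_MAX_BYTES // MAX_BYTES_PER_CHAR
```

In a cx_Oracle-based backend, a value flagged by such a check would be registered via `cursor.setinputsizes()` with the `cx_Oracle.CLOB` type before executing the statement, avoiding the implicit LONG conversion described above.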