Issue Type: Suggestion
Resolution: Unresolved
NOTE: This suggestion is for Confluence Server. Using Confluence Cloud? See the corresponding suggestion.
The Confluence upgrade task will fail if the pre-upgrade Oracle database dialect uses column data types that do not support Unicode (the VARCHAR2 data type). Working around this requires modifying every database column that uses the VARCHAR2 data type to the NVARCHAR2 data type, as described in the knowledge base article below:
It would therefore be much more convenient if Confluence implemented a schema that accepts the VARCHAR2 data type, to make upgrading easier.
Snippet from the Oracle documentation, which recommends using the VARCHAR2 data type over NVARCHAR2:
Link: http://docs.oracle.com/database/121/NLSPG/ch6unicode.htm#NLSPG319
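For context, the manual workaround described in the knowledge base article (and which this suggestion aims to make unnecessary) looks roughly like the following sketch; the table and column names here are hypothetical, not taken from the article:

```sql
-- Hypothetical example: convert a non-Unicode VARCHAR2 column to
-- NVARCHAR2 ahead of the upgrade. If the column already contains data,
-- Oracle may reject a direct MODIFY, in which case the data has to be
-- copied through a temporary column first.
ALTER TABLE SOME_CONFLUENCE_TABLE MODIFY (SOME_COLUMN NVARCHAR2(255));
```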
Issue links:
- is caused by: CONFSERVER-54048 Confluence instances using Oracle DBMS may not be able to upgrade to version 5.9 and later (Gathering Impact)
- is related to: CONFSERVER-99185 9.2 LTS Backport - Confluence upgrade fails on Oracle db that using VARCHAR2 when executing label upgrade task (Closed)
- is related to: CONFSERVER-99168 Confluence upgrade fails on Oracle db that using VARCHAR2 when executing label upgrade task (Waiting for Release)
- relates to: CONFCLOUD-51935 Implement Oracle database schema to accept VARCHAR2 datatype for Confluence database column. (Closed)
- is blocked by: PSR-51 (restricted: you do not have permission to view this issue)
As stated in your documentation, Atlassian recommends "net.sf.hibernate.dialect.OracleIntlDialect" as the dialect.
But when Confluence starts, it always changes the dialect to "com.atlassian.confluence.impl.hibernate.dialect.OracleDialect".
A mistyped dialect would lead to an error, so I assume this value is being changed by a check routine...
Is it true to say that using a non-international dialect will lead to the AO tables being created with VARCHAR2 columns?
Michael
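One way to check whether the AO tables were in fact created with VARCHAR2 columns is to query Oracle's data dictionary; a minimal sketch, assuming the AO tables use the standard AO_ name prefix and are owned by the Confluence schema user:

```sql
-- List AO table columns defined as VARCHAR2 rather than NVARCHAR2.
SELECT table_name, column_name, data_type
  FROM user_tab_columns
 WHERE table_name LIKE 'AO\_%' ESCAPE '\'
   AND data_type = 'VARCHAR2'
 ORDER BY table_name, column_name;
```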