Feed aggregator

Establishing an initial connection can take an extraordinary amount of time.

Tom Kyte - 14 hours 45 min ago
Since the Oracle server was upgraded from Oracle 12.1 to 12.2, we have noticed that the client's initial connection to the Oracle server can take an extraordinary amount of time (up to a minute). I also see it happen on an Oracle 18c server as well. Is it something related to Oracle changing their connection encryption?
Categories: DBA Blogs

Master Detail in APEX

Tom Kyte - 14 hours 45 min ago
Hi Tom, I'm trying to create a Master-Detail form using the Stacked style with two tables in APEX 19.2. However, when the app is created it displays a full page of rows from the first table. How do I control the display so that only a limited number of rows (like 5) of the first table is shown, and the child rows of the selected master row are displayed in the bottom region of the same page? Thanks, Rao
Categories: DBA Blogs

Node Reliability Issues

Tom Kyte - 14 hours 45 min ago
We are using the node-oracledb 4.0 driver to develop REST APIs for Oracle EBS 12.2 and 12.1. We followed the sample code snippets from the Oracle Node documentation to open the connection pool, close the connection, and execute the queries. We are using Express Server and notice some reliability issues with the Node server, but we could not find out the exact issue. Just wanted to ask the following questions:

1. Are there any known reliability issues?
2. Does the node-oracledb driver have the capability to be deployed in a production environment?
3. Currently we have 50 open connections in the pool. We have some clients with 1000 concurrent users. How do we scale the connection pool to this many users?
4. Any best practices/case studies to make Node reliable?
5. Any comparisons between ISG and Node? Since we are having a lot of Node issues, we are thinking of developing apps using ISG.

Appreciate your help, thanks a lot.

Query execution code:

const executeSQLQuery = (query, params, options = {}) => {
  return new Promise(async (resolve, reject) => {
    let conn;
    let throwTimeOutError = true;
    console.log('@@@@@ QUERY EXECUTING @@@@: ', query, params);
    try {
      console.log('connecting to pool...', oracledb.poolAlias);
      conn = await oracledb.getConnection('PoolConn122');
      //console.log('connecting to pool...', oracledb.poolAlias);
      console.log('connection details:', conn);
      let result;
      setTimeout(() => {
        if (throwTimeOutError) {
          //console.log('came');
          reject({ message: 'Check whether Params and PLSQL Query are valid or not' });
        }
      }, 60000);
      result = await conn.execute(query, params);
      //console.log('executed');
      throwTimeOutError = false;
      resolve(result);
    } catch (err) {
      reject(err);
      throwTimeOutError = false;
      console.log(err, 'coming');
    } finally {
      if (conn) { // conn assignment worked, need to close
        try {
          console.log('conn pool:', conn.getPool);
          await conn.close();
          console.log('closing connection new SQL');
        } catch (err) {
          console.log(err);
        }
      }
    }
  });
}

Connection pool:

module.exports = {
  hrPool: {
    user: process.env.user || "xxxx",
    password: process.env.password || "xxxx",
    connectionString: process.env.connectionString || "xxxxxx",
    threadPoolSize: process.env.UV_THREADPOOL_SIZE || 60,
    poolMin: 50,
    poolMax: 50,
    poolIncrement: 0,
    queueTimeout: 240000,
    poolAlias: 'PoolConn122',
    _enableStats: true
  }
};

API service:

async function getLocators(req, res, next) {
  let output;
  let query;
  let paramsForQuery;
  try {
    const reqParams = req.params;
    const reqQuery = req.query;
    if (null != reqParams) {
      query = masterDataQueries.getLocatorsQuery20D;
      let model = getLocatorsModel20D;
      if (reqQuery.hasOwnProperty('system') && Object.is(reqQuery['system'], 'CLD')) {
        cloudApiResult = await modelutil.callCloudApi(req.url, model);
        if (!cloudApiResult['error']) {
          res.status(200);
          res.send(cloudApiResult['response']);
        }
      } else {
        paramsForQuery = modelutil.constructParamsForQuery(reqParams.orgId, reqParams.lastSyncTime, reqParams.isFullRefresh, 'Response');
        const result = await sqlutils.executeSQLQuery(query, paramsForQuery);
        if(nu...
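A quick note on the snippet above: it wraps async/await inside a new Promise(async (resolve, reject) => ...) executor, and the detached 60-second setTimeout keeps running (and can still fire reject) even after the query has resolved. Here is a minimal sketch of the same helper without that anti-pattern, assuming the 'PoolConn122' pool from the question has already been created via oracledb.createPool():

const oracledb = require('oracledb');

async function executeSQLQuery(query, params) {
  let conn;
  try {
    // waits up to the pool's queueTimeout for a free pooled connection
    conn = await oracledb.getConnection('PoolConn122');
    // let the driver enforce the 60s limit instead of a detached timer
    // (callTimeout requires Oracle Client 18c or later libraries)
    conn.callTimeout = 60000;
    return await conn.execute(query, params);
  } finally {
    if (conn) {
      try {
        await conn.close(); // release the connection back to the pool
      } catch (err) {
        console.error(err);
      }
    }
  }
}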
Categories: DBA Blogs

AWS X-Ray Overview, Features, and Benefits

Online Apps DBA - 15 hours 17 min ago

AWS X-Ray allows the developer to analyze and create a service map that displays an application’s architecture, including the relationships between components and a dependency tree. With the help of AWS X-Ray, we can understand how our application and its fundamental services are performing, to identify and debug the root cause of performance issues and errors. […]

The post AWS X-Ray Overview, Features, and Benefits appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Amazon DynamoDB: Fast And Scalable NoSQL Database

Online Apps DBA - 15 hours 35 min ago

Amazon DynamoDB is a key-value and document database that delivers single-digit millisecond performance at any scale. It’s a fully managed, multi-region, multi-active, durable database with built-in security, backup and restore, and in-memory caching for internet-scale applications. Check out the blog at https://k21academy.com/awsdev14 for a better understanding. In this blog, we are going to cover one […]

The post Amazon DynamoDB: Fast And Scalable NoSQL Database appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

AWS SageMaker: Modeling With AWS Machine Learning

Online Apps DBA - 17 hours 26 min ago

AWS SageMaker has been the industry leader, offering many services that complement the entire lifecycle from data preparation to model deployment. Once you have identified that machine learning is the right fit for your business problem, your next step will be to identify the right machine learning type, which helps you find the right model for […]

The post AWS SageMaker: Modeling With AWS Machine Learning appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Google Cloud Storage And Database Services Rundown

Online Apps DBA - 17 hours 36 min ago

Google Cloud Platform (GCP) delivers various storage and database service offerings that remove much of the burden of building and managing storage and infrastructure. Google Cloud offers 9 storage & database services. Want to know more about Google Cloud Storage & Database Services? Read the blog post at https://k21academy.com/gcp17 to learn more. The blog post […]

The post Google Cloud Storage And Database Services Rundown appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Exploratory Data Analysis with AWS Machine Learning

Online Apps DBA - 17 hours 43 min ago

According to “The State of Data Science 2020” study, exploratory data analysis (EDA), data management, feature selection, and feature engineering account for more than 66% of a data scientist’s time. The data analysis process involves many steps on the data, such as cleansing and transformation, to find out what hidden information the data has and […]

The post Exploratory Data Analysis with AWS Machine Learning appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Amazon Cognito Overview & Its Features

Online Apps DBA - 17 hours 54 min ago

The challenges in building applications have been around user authentication and management. Not many developers want to build yet another user identification and authentication system for their application nor would they want to cause a user to create yet another account unless needed. Amazon Cognito makes it simpler for developers to manage user identities, authentication, […]

The post Amazon Cognito Overview & Its Features appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Introduction To Application Express (APEX) in Oracle Cloud

Online Apps DBA - 18 hours 6 min ago

As per Oracle’s new update, a few topics have been added in the 1Z0-998-20 version of the certification exam, including private endpoint, DB monitoring, and Application Express (APEX). Oracle APEX is also now available as a managed cloud service, the Oracle APEX Application Development service, which can be accessed via the cloud console. To read in brief about the […]

The post Introduction To Application Express (APEX) in Oracle Cloud appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Data Engineering with AWS Machine Learning

Online Apps DBA - 18 hours 32 min ago

Storing data for machine learning is challenging due to the varying formats and characteristics of data. Raw ingested data must first be transformed into the format necessary for downstream machine learning consumption, and once the data is ready to be used, it must be ingested from storage to the machine learning service. In this blog […]

The post Data Engineering with AWS Machine Learning appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

[Q/A] AWS Certified Solution Architect Associate Day 1: Introduction To Cloud & AWS

Online Apps DBA - Mon, 2021-02-22 23:48

In this blog, we will cover the Q/A’s from Day 1 of AWS Certified Solution Architect Associate covering Introduction To Cloud & AWS FAQs. This blog will help you to get started with AWS Solution Architect. The blog post at k21academy.com/awssa36 covers: 1. Cloud Services and Deployment Model 2. AWS Global Infrastructure Region AZ 3. […]

The post [Q/A] AWS Certified Solution Architect Associate Day 1: Introduction To Cloud & AWS appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Capturing DBMS_OUTPUT.PUT_LINE messages in the heat of a Six Nations campaign

The Anti-Kyte - Mon, 2021-02-22 15:35

It’s not a good time to be English, not in my house. That’s right, the Six Nations is in full swing and my Welsh wife and her family are once again indulging in the time-honoured practice of taunting the English.
As deployment of my emergency backup nationality ( “born in New Zealand, I’m neutral, honest !”) has failed, I’m reduced to keeping a low profile until the Brains Bitter runs out.
Fortunately, there’s excellent Wi-Fi reception in the cupboard under the stairs, so I can drown out the sound of Welsh Nationalism and concentrate on the topic at hand.

Whatever language you’re coding in, however sophisticated your IDE, sooner or later you will likely revert to the time-honoured debugging technique of writing out messages or variable values at runtime in an effort to find out what the heck is going on.
PL/SQL is no exception and DBMS_OUTPUT.PUT_LINE is the procedure that facilitates this practice.
However, whilst it’s a simple matter to view the messages interactively, how can you retrieve them if you are running the program as part of a batch process, where attaching a client to the session is not an option ?

WARNING – the code that follows may contain (memory) leeks…

How DBMS_OUTPUT works in SQL*Plus

To begin with, let’s find out what set serveroutput on in a SQL*Plus session actually does.
Yes, you could just have a quick look at the documentation, but you’re here now so…

alter session set tracefile_identifier = 'mike';
exec dbms_monitor.session_trace_enable(null, null, true, true);
set serveroutput on size unlimited

begin dbms_output.put_line(q'[What's occurrin' ?]'); end;
/

exec dbms_monitor.session_trace_disable;

We can find the location of the tracefile using :

select value
from v$diag_info
where name = 'Default Trace File';

To save mucking about with tkprof, I’ve just used an ftp client to retrieve the trace file from the server and then simply opened it in SQL Developer…

You can expand the nodes in the default Tree View. In this case however, things may become clearer if we simply open the Statistics View tab :

In addition to the statements we’ve entered for tracing, we can see that
set serveroutput on size unlimited has been translated into two statements :

declare
    l_line varchar2(32767);
    l_done number;
    l_buffer varchar2(32767) := '';
    l_lengthbuffer number := 0;
    l_lengthline number := 0;
begin
    loop
        dbms_output.get_line( l_line, l_done );
        if (l_buffer is null) then
            l_lengthbuffer := 0;
        else
            l_lengthbuffer := length(l_buffer);
        end if;
        if (l_line is null) then
            l_lengthline := 0;
        else
            l_lengthline := length(l_line);
        end if;
        exit when l_lengthbuffer + l_lengthline > :maxbytes OR l_lengthbuffer + l_lengthline > 32767 OR l_done = 1;
        l_buffer := l_buffer || l_line || chr(10);
    end loop;
    :done := l_done;
    :buffer := l_buffer;
    :line := l_line;
end;

…followed by …

begin dbms_output.enable(1000000); end;

Saving DBMS_OUTPUT.PUT_LINE messages to a table

The trace file has given us a number of pointers as to how we can do this.

First of all, let’s take a look at a stored program unit whose output messages we want to capture :

create or replace procedure six_nations( i_wal_pts in number, i_eng_pts in number) 
is
begin
    dbms_output.put_line('Wales : '||i_wal_pts||', England : '||i_eng_pts);
    dbms_session.sleep(5);
    if nvl(i_wal_pts, 0) <= nvl(i_eng_pts, 0) then
        dbms_output.put_line('ERROR : User is English!');
    else
        dbms_output.put_line('Bread of Heaven!');
    end if;
    dbms_session.sleep(10);
    dbms_output.put_line('Program Complete');
end;
/

Running this from SQL*Plus…

set serverout on 
exec six_nations(9,9);
exec six_nations(9,11);
exec six_nations(15,9);

…we get…

Wales : 9, England : 9
ERROR : User is English!
Program Complete


PL/SQL procedure successfully completed.

Wales : 9, England : 11
ERROR : User is English!
Program Complete


PL/SQL procedure successfully completed.

Wales : 15, England : 9
Bread of Heaven!
Program Complete


PL/SQL procedure successfully completed.

We can write these messages to a messages table…

create table six_nations_messages (
    id number generated always as identity,
    msg_ts timestamp,
    message varchar2(4000))
/

… by using the following anonymous block to wrap our calls…

declare
    v_msg six_nations_messages.message%type;
    v_status pls_integer := 0;
begin
    dbms_output.enable;
    six_nations(9,9);
    loop
        dbms_output.get_line(v_msg, v_status);
        if v_status = 0 then
            insert into six_nations_messages(msg_ts, message)
            values(systimestamp, v_msg);
        end if;
        exit when v_status != 0;
    end loop;
end;
/
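An alternative is the array version, dbms_output.get_lines, which drains the buffer in batches rather than one line per call - a sketch using the same table :

declare
    l_lines dbms_output.chararr;
    l_numlines integer;
begin
    dbms_output.enable;
    six_nations(9,9);
    loop
        l_numlines := 100; -- ask for up to 100 lines per call
        dbms_output.get_lines(l_lines, l_numlines);
        for i in 1 .. l_numlines loop
            insert into six_nations_messages(msg_ts, message)
            values(systimestamp, l_lines(i));
        end loop;
        exit when l_numlines < 100; -- fewer than requested means the buffer is empty
    end loop;
end;
/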

NOTE – for the sake of simplicity, this code assumes messages will not be longer than 4000 characters, although the limit for dbms_output is actually 32K.
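If longer messages are a real possibility, a minimal variation on the block above is to size the local variable at the full 32K and truncate (or split) on insert :

declare
    v_msg varchar2(32767);
    v_status pls_integer := 0;
begin
    dbms_output.enable;
    six_nations(9,9);
    loop
        dbms_output.get_line(v_msg, v_status);
        exit when v_status != 0;
        -- truncate to fit the 4000-character column; insert multiple rows
        -- here instead if you need the full text
        insert into six_nations_messages(msg_ts, message)
        values(systimestamp, substr(v_msg, 1, 4000));
    end loop;
end;
/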

Whilst this approach does allow us to access all of the messages, the messages are only read after the procedure finishes executing :

select * from six_nations_messages;

        ID MSG_TS                       MESSAGE                                                     
---------- ---------------------------- ------------------------------------------------------------
        21 20-FEB-21 14.38.56.037399000 Wales : 9, England : 9                                      
        22 20-FEB-21 14.38.56.042760000 ERROR : User is English!                                    
        23 20-FEB-21 14.38.56.042837000 Program Complete                                            

Multi-purpose logging routine

One alternative to peppering your code with dbms_output statements could be to amend your logging routine to output log messages to the screen if required.

A simple version of such a program might be :

create or replace procedure logit( i_msg in varchar2, i_interactive in boolean default true)
as
    pragma autonomous_transaction;
begin
    insert into six_nations_messages( msg_ts, message)
    values( systimestamp, i_msg);
    commit; -- an autonomous transaction must commit before it returns
    if i_interactive then
        dbms_output.put_line(i_msg);
    end if;
end;
/

NOTE – once again, this code is written to keep things simple. You might consider it more realistic to have some kind of table-based lookup to determine whether to output your message to the screen.
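For illustration, a sketch of that idea - a single-row settings table consulted by logit, where the log_settings table and its column are invented for the example :

create table log_settings( interactive_flag varchar2(1) );

insert into log_settings values('Y');
commit;

create or replace procedure logit( i_msg in varchar2 )
as
    pragma autonomous_transaction;
    l_interactive log_settings.interactive_flag%type;
begin
    select interactive_flag into l_interactive from log_settings;
    insert into six_nations_messages( msg_ts, message)
    values( systimestamp, i_msg);
    commit;
    if l_interactive = 'Y' then
        dbms_output.put_line(i_msg);
    end if;
end;
/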

Either way, you could call this instead of dbms_output…

create or replace procedure six_nations( i_wal_pts in number, i_eng_pts in number) 
is
begin
    logit('Wales : '||i_wal_pts||', England : '||i_eng_pts);
    dbms_session.sleep(5);
    if nvl(i_wal_pts, 0) <= nvl(i_eng_pts, 0) then
        logit('ERROR : User is English!');
    else
        logit('Bread of Heaven!');
    end if;
    dbms_session.sleep(10);
    logit('Program Complete');
end;
/

Now when we execute the procedure…

set serverout on
exec six_nations(21,19);

…we get the screen output…

Wales : 21, England : 19
Bread of Heaven!
Program Complete


PL/SQL procedure successfully completed.

…and also the log messages at the time they were issued by the procedure…

select * from six_nations_messages; 


        ID MSG_TS                       MESSAGE                                                     
---------- ---------------------------- ------------------------------------------------------------
        30 20-FEB-21 15.00.05.298440000 Wales : 21, England : 19                                    
        31 20-FEB-21 15.00.10.298579000 Bread of Heaven!                                            
        32 20-FEB-21 15.00.20.300390000 Program Complete                                            

Deb is threatening to do Bara brith for tea if Wales win. Fortunately, I’m quite partial to that particular Welsh delicacy. So much so that I’m off to find an inflatable daffodil and try to learn the words to “Land of My Fathers”.

Getting ORA-01404: ALTER COLUMN will make an index too large

Tom Kyte - Mon, 2021-02-22 11:26
Hi Team, I have been trying to alter the size of a column in a table which is part of a composite primary key. When I try altering the size of that particular column, I get "ORA-01404: ALTER COLUMN will make an index too large". Can you please help us understand why we get this error, and how to verify the size of the index to resolve it? Thank you!
Categories: DBA Blogs

Viewing columns for Oracle private temporary tables

Tom Kyte - Mon, 2021-02-22 11:26
Once a private temporary table is created I can see it in DBA_PRIVATE_TEMP_TABLES, but how can I list its columns? I can't find any information about it.
Categories: DBA Blogs

Partner Webcast – Hitchhikers Guide to Oracle Cloud (Part 3)

Oracle Cloud Infrastructure (OCI) is an IaaS that delivers on-premises, high-performance computing power to run cloud-native and enterprise IT workloads. OCI provides real-time...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Boto3 Dynamodb TypeError: Float types are not supported. Use Decimal types instead

Pakistan's First Oracle Blog - Mon, 2021-02-22 01:26

I was trying to ram data into AWS DynamoDB via Boto3 and the streaming failed due to the following error:


  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 102, in serialize

    dynamodb_type = self._get_dynamodb_type(value)

  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 115, in _get_dynamodb_type

    elif self._is_number(value):

  File "C:\Program Files\Python37\lib\site-packages\boto3\dynamodb\types.py", line 160, in _is_number

    'Float types are not supported. Use Decimal types instead.')

TypeError: Float types are not supported. Use Decimal types instead.



I was actually getting some raw data points from CloudWatch for later analytics. These datapoints were in float format, which is not supported by DynamoDB. Instead of importing decimal libraries or doing JSON manipulation, you can solve the above with a simple Python format expression like this:

"{0:.2f}".format(datapoint['Average'])

It worked like a charm afterwards. I hope that helps.
Categories: DBA Blogs

NOTE: Failed voting file relocation on diskgroup

Michael Dinh - Sat, 2021-02-20 18:32

After Oracle support performs maintenance for Exadata Cloud, there are many errors in the ASM alert log.

There have been many discussions about whether it's a BUG or whether the alert can be ignored.

Unfortunately, the BUG explanation does not fit, and the alert cannot be ignored.

The simple solution is to check for SUCCESS after Failed as shown below.

egrep -n 'NOTE: Failed voting file relocation on diskgroup|SUCCESS: refreshed membership' /u01/app/grid/diag/asm/+asm/+ASM1/trace/alert_+ASM1.log|grep DATAC1
1937:SUCCESS: refreshed membership for 1/0xacd2391b (DATAC1)
1972:SUCCESS: refreshed membership for 1/0xacd2391b (DATAC1)
4244:NOTE: Failed voting file relocation on diskgroup DATAC1
4250:SUCCESS: refreshed membership for 1/0x8f4036a8 (DATAC1)
9876:NOTE: Failed voting file relocation on diskgroup DATAC1
9881:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
10130:NOTE: Failed voting file relocation on diskgroup DATAC1
10135:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
11112:NOTE: Failed voting file relocation on diskgroup DATAC1
11425:NOTE: Failed voting file relocation on diskgroup DATAC1
11441:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
12410:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
13318:NOTE: Failed voting file relocation on diskgroup DATAC1
13717:NOTE: Failed voting file relocation on diskgroup DATAC1
13733:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
14703:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
15566:NOTE: Failed voting file relocation on diskgroup DATAC1
15865:NOTE: Failed voting file relocation on diskgroup DATAC1
15881:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
16836:SUCCESS: refreshed membership for 1/0x8f4038d9 (DATAC1)
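
If you would rather not pair the lines up by eye, here is a small shell sketch along the same lines (log path and diskgroup taken from the command above) that only complains when a failed relocation has no later membership refresh:

#!/bin/sh
# Sketch: count 'Failed voting file relocation' lines for a diskgroup
# that are not followed by a later 'SUCCESS: refreshed membership' line.
ALERT=/u01/app/grid/diag/asm/+asm/+ASM1/trace/alert_+ASM1.log
DG=DATAC1

awk -v dg="$DG" '
  $0 ~ ("NOTE: Failed voting file relocation on diskgroup " dg) { pending++ }
  $0 ~ "SUCCESS: refreshed membership" && $0 ~ dg { pending = 0 }
  END {
    if (pending > 0)
      print pending " failed relocation(s) with no later SUCCESS - investigate"
    else
      print "OK: every failed relocation was followed by a membership refresh"
  }
' "$ALERT"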

Want more comfort?

$ crsctl query css votedisk
##  STATE    File Universal Id                File Name Disk group
--  -----    -----------------                --------- ---------
 1. ONLINE   1b45f107ff974f1abf34ff7b005eaff1 (/dev/exadata_quorum/QD_DATAC1_QM828NAKVNXA1) [DATAC1]
 2. ONLINE   b3607b245b984f7ebfe5188c0775c44e (/dev/exadata_quorum/QD_DATAC1_QM738NAKVNXA2) [DATAC1]
 3. ONLINE   5d26163d434a4fb6bf2cb173bae3cae1 (o/192.168.136.14;192.168.136.15/DATAC1_CD_05_phx302307exdcl07) [DATAC1]
 4. ONLINE   3e1661086fed4f8bbf080db321282b23 (o/192.168.136.16;192.168.136.17/DATAC1_CD_04_phx302307exdcl08) [DATAC1]
 5. ONLINE   edac9a7822624fe4bfd59eb357d55d95 (o/192.168.136.18;192.168.136.19/DATAC1_CD_04_phx302307exdcl09) [DATAC1]
Located 5 voting disk(s).

# ocrcheck
Status of Oracle Cluster Registry is as follows :
         Version                  :          4
         Total space (kbytes)     :     491684
         Used space (kbytes)      :      90748
         Available space (kbytes) :     400936
         ID                       : 1880339001
         Device/File Name         :    +DATAC1
                                    Device/File integrity check succeeded
                                    Device/File not configured
                                    Device/File not configured
                                    Device/File not configured
                                    Device/File not configured
         Cluster registry integrity check succeeded
         Logical corruption check succeeded

Text searching the SQL of BI Publisher Data Model contents in the OBIEE catalog

Gareth Roberts - Sat, 2021-02-20 14:05

If you're working in an environment where there are a lot of Business Intelligence Publisher (BIP) reports in an on-site OBIEE instance, sometimes you want to assess the impact of, or find usages of, a certain database table or column. The following shell script is a very simple way to search the Data Model SQL in BI Publisher reports in the filesystem. Note of course this only works if you have an on-site OBIEE instance. For Oracle eBusiness Suite you'd be better off querying the CLOBs in the stored XML Publisher tables (a rough starting sketch follows the script below), and for Cloud based Oracle Analytics - you'll need to work that one out!

Change the CAT= line to store the location of the shared folder that your catalog resides in.

Save the contents to a file called bipdmsearch and remember to chmod 755 bipdmsearch so you can execute it.

#!/bin/sh

if [ -z "$1" ]; then
 echo "Usage: bipdmsearch <search string>"
 exit 1
fi

if [ -z "$CAT" ]; then
 CAT=/u01/app/obi12c/config12.2.1.2/bi/bidata/service_instances/ssi/metadata/content/catalog/root/shared
fi

echo "cd $CAT"
cd $CAT

echo "Search for $1"
find . -type f -name \*xdm\* -exec grep -il "$1" {} \;

echo "Done"
This is a post from Gareth's blog at http://garethroberts.blogspot.com

... and we're back!

Gareth Roberts - Sat, 2021-02-20 13:52

Wow - it has been a long time since I posted here! So much so that the world has changed - with an onslaught of content and apps and tools and technologies and Cloud! However that said I've found myself coming back to visit my old posts from time to time as technical references to especially fiddly things are often lost in the mix of content these days. I've always had a list of things to add here, and over the last few years I've been working in-depth with the Oracle Business Intelligence Suite and picked up a bunch of tips and tricks along the way, so the list is even longer now ;-)

Although there are more people documenting more things, there are still plenty of techniques that are obscure but warrant being referenced.

So, we'll start off with a couple of easy ones, maybe something using OBIEE back end catalog updates, then kick into some tougher ones .... maybe Google Earth KML output from Oracle SQL/BI Publisher or something like that :-)

Enjoy ...

This is a post from Gareth's blog at http://garethroberts.blogspot.com
