Feed aggregator

index

Tom Kyte - 2 hours 15 min ago
Tom: I have a question regarding indexes. Can you explain to me in detail what the following means? "If the index is a concatenation of multiple columns and one of the columns contains a NULL value, the row will be in the index column containing the NULL value and will be left empty." +++++++++++++++++++++++++++++++++++++++++++++ The above is exactly what it says in a book. I can't even understand its English. "the row will be in the index column containing the NULL value and will be left empty" -- what does that mean?
Categories: DBA Blogs

Request for script to import data

Tom Kyte - 2 hours 15 min ago
Dear Ask TOM Team, Thank you for your appreciated efforts and for providing guidance to all concerned clients. My question: I have a running, in-production Oracle DB, 11g ... and would like to import and update the data from another DB with the same structure and version. Please: 1- provide me the script to do the required import of data into the running DB. 2- provide me the script to do the required export of data from the running DB. Regards, Reda Eltayef
Categories: DBA Blogs

I want to export only the objects owned by a given schema using EXPDP

Tom Kyte - 2 hours 15 min ago
Hi, I want to export only the objects owned by a given schema using EXPDP, but unfortunately it is exporting all the objects the schema has access to. I am using the syntax below. <code>expdp school/school@orcl schemas=(demo) directory=DATA_PUMP_DIR dumpfile=demo.dmp logfile=expdp.log</code> Please help. Thanks, Fahd
Categories: DBA Blogs

Autonomous transactions and commit point optimization

Tom Kyte - 2 hours 15 min ago
Hi, We have an application written in PL/SQL and thus with commit point optimization. However, AWR reports show a lot of log file sync waits. How do autonomous transactions, which we are using for logging purposes, operate with PL/SQL code? Is the commit point optimization utilized? If a procedure defined as an autonomous transaction is called and no actual DML operations are done (which could happen in logging operations), is there still some commit handling done? If so, what? lh
Categories: DBA Blogs

How to calculate a timestamp from an old ora_rowscn, taking as a reference the current ora_rowscn / timestamp_to_scn(sysdate) from DUAL

Tom Kyte - 2 hours 15 min ago
As you know, the association between an SCN and the timestamp when the SCN was generated is remembered by the database only for a limited time period, and an error is returned if the SCN specified as the argument to SCN_TO_TIMESTAMP is too old. Example: <code>select scn_to_timestamp(max(ora_rowscn)) from MY_TABLE</code> <i><b>ORA-08181: specified number is not a valid system change number ORA-06512: at "SYS.SCN_TO_TIMESTAMP", line 1 08181. 00000 - "specified number is not a valid system change number" *Cause: supplied scn was beyond the bounds of a valid scn. *Action: use a valid scn.</b></i> Nevertheless, I can obtain max(ora_rowscn) from MY_TABLE without any kind of error: <code> select max(ora_rowscn) from MY_TABLE MAX(ORA_ROWSCN) --------------- 99464620 </code> I would like to obtain the corresponding timestamp of this ORA_ROWSCN approximately (I don't care if it varies by some hours), <b>without using any AUDIT tool, using only PL/SQL statements</b>. Does any relationship exist between the above MAX(ORA_ROWSCN) from MY_TABLE and the current ORA_ROWSCN from sysdate? <code> select timestamp_to_scn(sysdate), TO_CHAR(SYSDATE,'dd-mm-yyyy hh24:mi:ss') from dual TIMESTAMP_TO_SCN(SYSDATE) TO_CHAR(SYSDATE,'dd-mm-yyyy hh24:mi:ss') ------------------------- ---------------------------------------- 116631465 13-10-2021 12:12:02 </code> Could I calculate, using some kind of algorithm, the timestamp corresponding to the <b>99464620</b> value? Thanks in advance.
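There is no supported mapping once an SCN falls outside the retention window, but the rough idea the question reaches for can be sketched as linear extrapolation between two known (time, SCN) reference samples. A minimal illustration in Python (the function name and sample values are hypothetical, and the assumption of a constant SCN generation rate is a strong one):

```python
from datetime import datetime, timedelta

def estimate_scn_time(ref1, ref2, target_scn):
    """Estimate the wall-clock time of target_scn by assuming a constant
    SCN generation rate between two (datetime, scn) reference samples."""
    (t1, s1), (t2, s2) = ref1, ref2
    rate = (s2 - s1) / (t2 - t1).total_seconds()  # SCNs generated per second
    return t2 - timedelta(seconds=(s2 - target_scn) / rate)

# Two hypothetical samples taken an hour apart at a steady 100 SCNs/second:
a = (datetime(2021, 10, 13, 12, 0), 1_000_000)
b = (datetime(2021, 10, 13, 13, 0), 1_360_000)
print(estimate_scn_time(a, b, 1_180_000))  # → 2021-10-13 12:30:00
```

In practice the SCN rate is anything but constant, so an estimate produced this way can easily be off by hours — which the question explicitly tolerates.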
Categories: DBA Blogs

Different sql id/sql text showing up in v$session

Tom Kyte - Wed, 2021-10-20 06:06
Hello! I have a particular ETL job that fires the below 4 select queries against a view in our Oracle database: <code>select * from view where mod(id,4) = 1; select * from view where mod(id,4) = 2; select * from view where mod(id,4) = 3; select * from view where mod(id,4) = 0; </code> However, when the queries start executing and I check v$session for the 4 sessions running these 4 queries, I see a common sql id showing up for all 4, and the sql text for this sql id seems completely unfamiliar/unrelated to the queries that are actually being executed. I am a bit lost, as I have not come across such a scenario before. What could this sql id be, and why would it show up in v$session for all 4 different queries?
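As an aside, the four mod(id,4) predicates do carve the id space into disjoint, exhaustive buckets, so each session works an independent slice. A quick sanity check of that partitioning scheme (illustrative Python, not part of the original post):

```python
# mod(id,4) = 0..3 assigns every id to exactly one of four buckets.
ids = list(range(1, 1001))
buckets = {r: [i for i in ids if i % 4 == r] for r in range(4)}

assert sum(len(b) for b in buckets.values()) == len(ids)       # exhaustive
assert all(i % 4 == r for r, b in buckets.items() for i in b)  # disjoint
print([len(buckets[r]) for r in range(4)])  # → [250, 250, 250, 250]
```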
Categories: DBA Blogs

Failed to start The nginx HTTP and reverse proxy server on RedHat EC2 Linux

Pakistan's First Oracle Blog - Tue, 2021-10-19 21:55

I needed a simple reverse proxy to redirect connections to an RDS database in a private subnet, so I quickly created a Red Hat Linux EC2 instance, installed NGINX, and set up the nginx.conf file for session redirection. My nginx.conf looked like the following:

user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;

include /usr/share/nginx/modules/*.conf;

events {
    worker_connections 1024;
}

stream {
    upstream target_server {
        server targetdb:1521;
    }

    server {
        listen 1521;
        proxy_pass target_server;
    }
}


But starting the nginx process gave the following error:


[root@test nginx]# systemctl start nginx

Job for nginx.service failed because the control process exited with error code. See "systemctl status nginx.service" and "journalctl -xe" for details.

[root@test nginx]# systemctl status nginx.service
● nginx.service - The nginx HTTP and reverse proxy server
   Loaded: loaded (/usr/lib/systemd/system/nginx.service; enabled; vendor preset: disabled)
   Active: failed (Result: exit-code) since Wed 2021-10-20 13:40:57 AEDT; 5s ago
  Process: 14702 ExecStartPre=/usr/sbin/nginx -t (code=exited, status=1/FAILURE)
  Process: 14700 ExecStartPre=/usr/bin/rm -f /run/nginx.pid (code=exited, status=0/SUCCESS)

Oct 20 13:40:57 test systemd[1]: Starting The nginx HTTP and reverse proxy server...
Oct 20 13:40:57 test nginx[14702]: nginx: [emerg] unknown directive "stream" in /etc/nginx/nginx.conf:9
Oct 20 13:40:57 test nginx[14702]: nginx: configuration file /etc/nginx/nginx.conf test failed
Oct 20 13:40:57 test systemd[1]: nginx.service: control process exited, code=exited status=1
Oct 20 13:40:57 test systemd[1]: Failed to start The nginx HTTP and reverse proxy server.
Oct 20 13:40:57 test systemd[1]: Unit nginx.service entered failed state.
Oct 20 13:40:57 test systemd[1]: nginx.service failed.

Solution: On RHEL, the stream module is built as a dynamic module and shipped in a separate package, so just install nginx-mod-stream:

[root@test nginx]# ls -ltr /usr/lib/nginx/modules/ngx_stream_module.so
ls: cannot access /usr/lib/nginx/modules/ngx_stream_module.so: No such file or directory
[root@ip-10-219-40-147 nginx]# yum install nginx-mod-stream

Now if you start the nginx service, it should work.
Categories: DBA Blogs

Out Parameter using the scheduler

Tom Kyte - Tue, 2021-10-19 11:46
Hi, There is no test case needed; it's more of a question. We have a stored proc which, on successful execution, returns either a 1 or a 0. Can that 1 or 0 be captured by the Oracle Scheduler? Thanks, Vic
Categories: DBA Blogs

Running Kubernetes on Oracle Cloud OCI

Andrejus Baranovski - Mon, 2021-10-18 03:42
Oracle Cloud OCI provides a good environment to run your Kubernetes workloads. In this video, I show how to access a Kubernetes cluster in OCI and explain the artifacts related to the cluster. I show how the Skipper API runs on Kubernetes deployed on OCI. The cluster runtime is accessed through the cloud shell.

 

The role of Data Lakes or Staging Areas for Data Warehouse ETL

Rittman Mead Consulting - Mon, 2021-10-18 03:21


We were asked recently by a client about the role of a staging area for ETL processes when loading a Data Warehouse; specifically, they wanted to know whether this is still required if using a Data Lake.


TLDR: Data Lakes and staging areas can be interchangeable in terms of ETL processes. The key considerations are who else and what else will make use of the data within a Data Lake, and whether you have the right policies and procedures in place to ensure good data governance.


As with so many things, people often see Data Lakes as a technology solution, but the reality is that a Data Lake is a service. Data Lakes provide a method of surfacing data in its raw/native form to a variety of users and downstream processes; they are intended to use relatively cheap storage and to help accelerate insights into business decisions. We see clients opting to implement Data Lakes on a variety of different technologies, each with its own benefits, drawbacks and considerations. However, the prevailing trend in terms of operating an effective Data Lake, and in terms of controlling cost, is the need for careful governance across various aspects of data quality and security, including items such as data retention and the data dictionary.

A staging area for a Data Warehouse serves the single, focused purpose of holding raw data from source systems and providing a location for transient tables that are part of the transformation steps. Depending on the design methodology and ETL toolset the purpose of the staging area varies slightly, but the target audience is always simply the ETL process and the Data Engineers who are responsible for developing and maintaining the ETL. This doesn't negate the need for the data governance that is required in a Data Lake, but it does simplify it significantly when compared to the multitude of users and processes which may access a Data Lake.



[Figure: Traditional direct ETL from source system to Data Warehouse]
[Figure: Data Warehousing with the inclusion of a Data Lake]



CONCLUSIONS

  1. Depending on the toolset chosen for the Data Lake, ETL, and Data Warehouse, the location and method for performing transformations and storing transient intermediate tables could be either in the Data Lake or within a sub-schema of the Data Warehouse database.

  2. If your ETL and Data Warehouse are the only downstream users of a Data Lake, is it even a Data Lake?

  3. Get your processes and policies right in terms of data governance, retention, and security.

Categories: BI & Warehousing

My Posts on Standby Database[s] -- Data Guard

Hemant K Chitale - Sun, 2021-10-17 04:10
Although I have, in previous posts, covered Data Guard in 12c (and 12cR1 RAC), this is a list of recent Blog Posts on Data Guard. 
Categories: DBA Blogs

My Posts on Database Flashback

Hemant K Chitale - Sun, 2021-10-17 03:43

 

My Blog Posts on Database Flashback

(The first 5 demonstrations are in 11gR2. The last demonstration is in 19c.)
Categories: DBA Blogs

Blue-Green Deployment in Azure

Online Apps DBA - Sun, 2021-10-17 00:48

When you are deploying a new change into production, the associated deployment should happen in a predictable manner. In simple terms, this means no disruption and zero downtime! This safe strategy can be achieved by working with two identical infrastructures: the "green" environment hosting the current production and the "blue" environment with the new […]

The post Blue-Green Deployment in Azure appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

OpenShift on Azure for Beginners

Online Apps DBA - Sun, 2021-10-17 00:30

Architecture and Features: Azure Red Hat OpenShift ➪ provides fully managed and highly available OpenShift clusters on demand that are jointly monitored and operated by Microsoft and Red Hat. Azure Red Hat OpenShift ➪ follows a simple architecture that depends on smaller decoupled entities called microservices. These decoupled entities work together and run on the Kubernetes cluster. All of […]

The post OpenShift on Azure for Beginners appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Oracle Cloud Pay As You Go (PAYG) Subscription Model

Online Apps DBA - Sun, 2021-10-17 00:10

The Pay As You Go (PAYG) model lets you quickly provision services with no commitment, and you're only charged for the services you use. There is no upfront commitment and no minimum service period. Any cloud infrastructure (IaaS) and platform (PaaS) services consumed are metered and billed based on that consumption. This model produces bills for outsourced […]

The post Oracle Cloud Pay As You Go (PAYG) Subscription Model appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

[Recap] Day 3: Manage Identity and Access in Cloud, Part- II

Online Apps DBA - Sat, 2021-10-16 23:47

With each passing day, gigabytes of data are generated, and hackers have a banquet to choose from unless we, as security specialists, intervene. One quick question before we get into the third session's cloud security review! Are you going to let anyone inside your home? No, no, no! This demonstrates the importance of security and of managing identity and access even in the […]

The post [Recap] Day 3: Manage Identity and Access in Cloud, Part- II appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

[Recap] Day 4: Implement Platform Protection In Cloud

Online Apps DBA - Sat, 2021-10-16 23:36

If you work in IT, you've probably heard about cloud computing. If you work in the cloud, you've probably heard of Cloud Security, and if you work in Cloud Security, you've probably heard of Platform Security. So it doesn't matter if you work in IT or Sales, Cloud or Operations, security or deployment. If you are […]

The post [Recap] Day 4: Implement Platform Protection In Cloud appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

[Recap] Day 5: Implement Platform Protection In Cloud – Part 2

Online Apps DBA - Sat, 2021-10-16 23:23

How often do we feel anxious when our data is on-premises, concerned about its security? But once you've moved your data to the cloud, you'll live stress-free. On this note, let's start the recap of Day 5 of our Cloud Security course, which began with Firewall and saw us build user-defined routes using a Route Table. […]

The post [Recap] Day 5: Implement Platform Protection In Cloud – Part 2 appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Basic DDL Replication with Oracle GoldenGate

DBASolved - Sat, 2021-10-16 19:40

With any type of replication configuration or replication tool, the primary purpose is to move the data as transactions are committed […]

The post Basic DDL Replication with Oracle GoldenGate appeared first on DBASolved.

Categories: DBA Blogs
