#infosec issues on moving to the #cloud #DBaaS

Last week I was at Oracle Cloud World working at the ODTUG booth. This gave me the opportunity to talk to a lot of people who are seriously looking at moving their environments to the cloud. While chatting with them, I started to pull together some thoughts on the security issues that come with moving to the cloud. Many of those issues are the same for hosting your own database applications. There are several issues with moving to the cloud, and if you don’t address them, things can become dark and stormy.

What security questions do you need to address prior to moving to the cloud? Note: many of these issues also apply to hosting your own databases! This subject is complex and I’m just touching on some of the issues. If you don’t do your due diligence, you will get burned.


Will your data be encrypted, and how? First off, all of your data should be encrypted by default. <OPINION> I am also of the opinion that a cloud provider should not even offer to store your information unencrypted. </OPINION> With the advent of hardware encryption modules, encryption performance is a non-issue.

There are a couple of options for encrypting your data; both have strengths and both have weaknesses. The first and easiest option to implement is tablespace encryption, which encrypts all of the data stored in the tablespace. The downside is that the data is unencrypted in the SGA.
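As a minimal sketch, a tablespace can be created encrypted from the start (the tablespace, datafile, and table names here are made up, and an open TDE keystore is assumed):

```sql
-- Requires an open TDE keystore/wallet before creating the tablespace.
CREATE TABLESPACE sensitive_ts
  DATAFILE '/u01/oradata/pdev/sensitive_ts01.dbf' SIZE 100M
  ENCRYPTION USING 'AES256'
  DEFAULT STORAGE (ENCRYPT);

-- Any table created in this tablespace is encrypted on disk.
CREATE TABLE customers_enc TABLESPACE sensitive_ts
  AS SELECT * FROM customers;
```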

The other option is column encryption. This requires a bit of work up front to set up. You need to identify the atomic pieces of data that must be encrypted, then go through your indexing scheme to make sure you have not put indexes on columns that are encrypted with salt, and that you don’t have foreign key constraints on encrypted columns. The upside of column encryption is that the data stays encrypted in the SGA.
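A minimal sketch of column encryption (the table and column names are made up): note the NO SALT clause on the column we intend to index.

```sql
-- Encrypt a sensitive column in place. NO SALT is required if the
-- column will be indexed.
ALTER TABLE customers MODIFY (ssn ENCRYPT USING 'AES256' NO SALT);

-- Indexing an encrypted column only works when it is encrypted NO SALT.
CREATE INDEX customers_ssn_idx ON customers (ssn);

-- A column encrypted WITH SALT (the default) cannot be indexed.
ALTER TABLE customers MODIFY (cc_nbr ENCRYPT USING 'AES256');
```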

Will your backups be encrypted? Again, the answer must be yes, and this is where it gets a bit tricky. RMAN backups are block-level copies of the data files, so if your data is encrypted, your backups will be encrypted. However, if someone runs a Data Pump export of your data to refresh a lower environment and does not specify encryption in the options, then your data will be saved unencrypted. The cloud provider must audit for this event, and if it does happen, you need to be informed and the cloud provider must make every effort to find and destroy that Data Pump file and any copies that have been made of it. You will notice I used the word destroy as opposed to delete. There is good reason for that: if you delete a file, there is still ghost data that can be recovered. So that file or those files will need to be destroyed by a utility such as Linux shred.
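As a sketch, a Data Pump export can be forced to stay encrypted with the ENCRYPTION parameters; the directory, dump file, and table names here are made up:

```
# expdp parameter file (e.g. expdp system parfile=secure_exp.par)
DIRECTORY=dpump_dir
DUMPFILE=customers_refresh.dmp
TABLES=customers
ENCRYPTION=ALL
ENCRYPTION_MODE=PASSWORD
ENCRYPTION_PASSWORD=<strong passphrase>
```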

The trusted insider attack surface has changed. <OPINION> It is safe to assume Oracle and other Tier 1 cloud providers will vet their system administrators. </OPINION> However, people change; that is just a fact of life. I frequently use the example of Edward Snowden. Prior to leaking NSA documents, he had gone through polygraph examinations and had his entire background put under a microscope; then he changed.

How will your cloud provider protect you against their trusted insiders? The concept is easy: wall off your data from being seen by the system administrator. I’ve been a DBA for decades and can tell you with complete honesty that the DBA or SA does not need access to your data in order to do their job. <OPINION> Oracle has a great product, Database Vault, that is designed to wall off your data from the SAs. Any cloud solution should include the implementation of Database Vault. </OPINION>
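As a sketch of what that wall looks like, a Database Vault realm can be put around an application schema so that even a privileged DBA cannot reach into it. The realm, schema, and account names here are made up, and the DBMS_MACADM calls assume Database Vault is already configured:

```sql
BEGIN
  -- Create a realm around the application schema.
  DBMS_MACADM.CREATE_REALM(
    realm_name    => 'Customer Data Realm',
    description   => 'Blocks DBAs/SAs from application data',
    enabled       => DBMS_MACUTL.G_YES,
    audit_options => DBMS_MACUTL.G_REALM_AUDIT_FAIL);

  -- Protect every object in the APP_OWNER schema.
  DBMS_MACADM.ADD_OBJECT_TO_REALM(
    realm_name   => 'Customer Data Realm',
    object_owner => 'APP_OWNER',
    object_name  => '%',
    object_type  => '%');

  -- Only the application account may use the protected objects.
  DBMS_MACADM.ADD_AUTH_TO_REALM(
    realm_name => 'Customer Data Realm',
    grantee    => 'APP_USER');
END;
/
```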

Your cloud provider must provide a proven tool that protects your information from trusted insiders at the cloud provider.

Your cloud provider must also provide an integrated audit solution that tracks all audit events and allows you to report on them. Oracle Audit Vault comes with BI; you can use canned reports and customize those reports for your requirements.

Can you make customizations to the security? Oracle Real Application Security (RAS) gives you Redaction, Virtual Private Database, and auditing on all connections. A full discussion of RAS is beyond the scope of this paper.

<OPINION> At the very least, you should be able to implement Virtual Private Database and Redaction to protect your data from the normal use of your applications. </OPINION> (I say normal use of your applications because, using different tools and grants, it is possible to bypass these features.)
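As a sketch, a redaction policy can mask a column for everyone except the application account; the schema, table, column, and user names here are made up:

```sql
BEGIN
  -- Redact the first five digits of the SSN for everyone except the
  -- application account, leaving XXX-XX-1234 style output.
  DBMS_REDACT.ADD_POLICY(
    object_schema       => 'APP_OWNER',
    object_name         => 'CUSTOMERS',
    policy_name         => 'redact_ssn',
    column_name         => 'SSN',
    function_type       => DBMS_REDACT.PARTIAL,
    function_parameters => 'VVVFVVFVVVV,VVV-VV-VVVV,X,1,5',
    expression          => q'[SYS_CONTEXT('USERENV','SESSION_USER') <> 'APP_USER']');
END;
/
```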

Will the cloud provider implement and configure Database Firewall? Database Firewall is a good tool to defend against SQL injection attacks, but it takes a lot of work to properly configure, especially if you are using a custom application. Will the cloud provider be responsible for the configuration of Database Firewall?

How are you going to get your data back if you decide to break up with your cloud provider?

If your cloud provider is using Oracle 12c multitenant, an encryption key is generated with the container database, and that key is used to decrypt the encryption key for the pluggable database. I’m not going to dive too deep into this. The cloud provider can unplug your database and provide you with a set of keys to decrypt your data.
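As a rough sketch, the 12c syntax for moving a PDB's TDE keys looks like this (the paths, secret, and passwords are made up; check the exact syntax against the documentation for your release):

```sql
-- In the source container: export the encryption keys to a file,
-- protected by a transport secret.
ADMINISTER KEY MANAGEMENT
  EXPORT ENCRYPTION KEYS WITH SECRET "transport_secret"
  TO '/secure/path/pdb_keys.p12'
  IDENTIFIED BY keystore_password;

-- In the destination keystore: import them after plugging in the PDB.
ADMINISTER KEY MANAGEMENT
  IMPORT ENCRYPTION KEYS WITH SECRET "transport_secret"
  FROM '/secure/path/pdb_keys.p12'
  IDENTIFIED BY keystore_password WITH BACKUP;
```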

Then the worst happens: there is a data breach. You need to know how your cloud provider will make you whole. The truth is, your customers will be upset with you, and maybe with your cloud provider.

In the end, you are the steward of your customers’ data, and with that stewardship comes responsibility.

2015 #InfoSec in review. We get a big fat “F”

We are stewards of our customers’ data and need to do better. <OPINION> I would give us a big fat “F” for data security in 2015. </OPINION> What happened and what needs to be improved? We saw weak passwords, lack of encryption, malware, and social engineering over and over again. One very sad aspect of these attacks is that once a system was compromised, the attack went on for months, even years, before being uncovered. So again, we really need to do better: reading the logs, doing analytics on system behavior, and locking down the data.

At a high level, the attack vectors have not changed much over the years. Malware payloads are still being delivered by drive-by downloads and infected emails. Businesses and medical groups are still leaving sensitive data unencrypted, and trusted insiders can still get to sensitive information. We are also seeing encrypted connections being made to unknown servers and allowing that traffic to go through our firewalls.

I’m going to do my best to keep my opinion clear by using the <OPINION> </OPINION> tags so you know what is my personal opinion. I’m not going to go through every attack that happened in 2015, but for each one I cover, I will also let you know what I think should or could have been done to mitigate it.

1) IRS Data breach

In the IRS’s effort to make it easy for users to access their data, they exposed very sensitive tax and financial data to hackers. Over 100,000 people were compromised through this system, and $50,000,000 in false tax refunds has been stolen from the US Government.

When we design systems, one of the top requirements is user experience. If we make it too hard to access a system, it will not be used; make it too easy, and the data can be compromised. We need to weigh the value of the data against the user experience. Users expect their information to be respected and protected.

2) OPM data breach

The OPM hack impacted me personally, along with my wife. Over 22 million people had their full background and biometric information leaked to a foreign intelligence agency. I watched the congressional hearings and was very disappointed by the <OPINION>incompetence of the people </OPINION> testifying. The Director of OPM resigned, but <OPINION> the CIO of OPM should have been walked out the door; </OPINION> it was her job to make sure this information was secure. I still don’t know why this information was not stored on the classified network, as it should have been. <OPINION> As an added insult, the government is offering two years of credit monitoring. As if a foreign intelligence agency is really interested in taking out credit cards in our names. The big threat is that we are now at risk of blackmail. </OPINION>

The OPM breach was malware making encrypted connections to unknown servers. This is a case where blacklisting IPs would not work, but whitelisting connections would. Sensitive data should only be transmitted over trusted paths, and <OPINION> if encrypted connections are being made, then those connections should be treated as sensitive. </OPINION>

3) UCLA Medical

UCLA Medical lost 4.5 million records of unencrypted patient data, including PII and medical information. There is no excuse not to encrypt sensitive data. I still hear the old excuse that there is a performance impact from encryption. With the availability of hardware encryption modules, this argument does not hold water.

After encrypting data, we still have to be careful about ghost data and data leakage. A DBA can still run Data Pump, get an unencrypted copy of the data, and copy that data to another location. We do this all the time to refresh an environment. Controls need to be placed on Data Pump copies so any information exported from the database stays encrypted and the location of those copies is known. When moving data from unencrypted to encrypted, all ghost data must be shredded.
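One possible control, sketched here, is Database Vault's Data Pump authorization: with Database Vault enabled, a DBA cannot export a protected schema unless explicitly authorized, and the authorization can be removed when the work is done. The user and schema names are made up:

```sql
-- Without this authorization, expdp against the protected schema
-- fails under Database Vault.
BEGIN
  DBMS_MACADM.AUTHORIZE_DATAPUMP_USER(
    user_name   => 'REFRESH_DBA',
    schema_name => 'APP_OWNER');
END;
/

-- Revoke the authorization once the environment refresh is complete.
BEGIN
  DBMS_MACADM.UNAUTHORIZE_DATAPUMP_USER(
    user_name   => 'REFRESH_DBA',
    schema_name => 'APP_OWNER');
END;
/
```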

4) Ashley Madison

This one did not really interest me very much, other than the disrespect Ashley Madison showed their customer base. This hack ruined some reputations and exposed a large number of people to blackmail. Yes, credit card numbers were encrypted, but geolocation data and email addresses were not. <OPINION> The large number of people who used their work and government email addresses was shocking. People who are so blind to opsec deserve to be caught. </OPINION>

5) Hyatt

Just recently we learned about the Hyatt payment-processing data breach. Not much is known at this time, other than that malware sent encrypted data to an unknown server. This is yet another case of needing a trusted path for sensitive data: use a whitelist and deny access to any unknown IP address.

6) Trump Hotels

In a year-long campaign against Trump Hotels, credit card and security code information was stolen from customers of Trump properties. I’m going to keep beating this drum: you need a trusted path from the point of sale to the processing database, so <OPINION> implement whitelists and deny any encrypted traffic to unknown IPs.</OPINION>

7) T-Mobile and Experian

T-Mobile placed their trust in Experian and suffered a massive breach: 15 million customers’ full names, Social Security numbers, and dates of birth, plus some passport numbers. In this case, no payment card data was compromised, yet this is still enough information for identity theft. Not a lot of information has been provided on the attack vector used.

In December 2013, T-Mobile suffered another data breach with the vendor Decisioning Solutions, which is owned by Experian. In both of these cases, T-Mobile is offering credit monitoring through ProtectMyID, which is owned by Experian. <OPINION> Why does T-Mobile continue doing business with Experian? </OPINION>

This is not an exhaustive list of breaches for 2015.

8) VTECH

VTECH, the toy manufacturer, exposed data on 4.8 million customers due to password insecurity.

9) Securus

Securus lost 70 million call logs and recorded conversations of people in prison. These recordings also included attorney-client privileged conversations.

10) FBI

The FBI LEO Portal was hacked; the attack vector and damage are still classified.

11) Scottrade

Scottrade lost data on 4.6 million customers in a two-year campaign. Krebs on Security reported that the data was used for stock scams.

12) Excellus Blue Cross Blue Shield.

Excellus Blue Cross Blue Shield lost data on 10 million customers. The attack started in 2013 and was not discovered until 2015.

13) Anthem

Anthem lost data on 78.8 million customers. I have read the count was actually 80 million customers and 19 million rejected customers.

14) Anonymous vs ISIS.

I only add this because of the interest in ISIS. After the Paris attacks, Anonymous started OpParis, which is turning into an interesting game of whack-a-mole. Anonymous is using brute force to shut down ISIS-controlled accounts and servers. The results are debatable; <OPINION> it would be better to allow some of the systems to stay online to gather intelligence on ISIS. By shutting them down, you are forcing them onto the dark web, where it’s harder to gather intelligence.</OPINION>

<OPINION> Sadly, many times after a breach the offending company offers one or two years of credit monitoring. The customer will be exposed for the rest of their life. Two years of credit monitoring is wholly inadequate. </OPINION>

What do we need to do?

  1. Secure the data. Encrypt data at rest so that if the data is compromised, it will be useless to the criminal.
  2. Encrypt sensitive data on the network. Man-in-the-middle attacks happen.
  3. Build trusted paths for sensitive information. All sensitive information must go through that path. If an encrypted session is being built to an unknown server, deny that connection.
  4. Secure the perimeter. We are letting encrypted traffic go to unknown servers. This has to stop: use whitelists. If a workstation or node can process sensitive data, then that workstation or node should not be able to access unknown servers.
  5. Secure programming practices. I still see firsthand sloppy programming that is vulnerable to SQL injection. Organizations must implement secure coding practices, with code reviews that also include looking for vulnerabilities. A couple of months ago, I came across a piece of code that was vulnerable to SQL injection. When I brought it up to program management, I was told that going back to fix the problem would put the program behind schedule: move forward and we will fix it after going to production. <OPINION> This is the wrong attitude. </OPINION> If the program had standards in place before coding started, the problem would not have gotten as far as it did.
  6. Secure the data from trusted insiders. I won’t get into the political issues of Bradley Manning or Edward Snowden. Both of them were vetted and had access to sensitive information; they broke their trust and stole information that did incalculable damage.
  7. Routinely review audit logs to look for unusual behavior. I’m still seeing audit logs get ignored until there is a problem. Products like Oracle Audit Vault bring all of your audit data into one package, where you can create BI dashboards to find out when something is happening outside of the norm.
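To illustrate the SQL injection point above, here is a minimal sketch of the difference between injectable and safe dynamic SQL in PL/SQL (the table and procedure names are made up):

```sql
-- VULNERABLE: user input is concatenated into the statement text, so
-- crafted input can change the query itself.
CREATE OR REPLACE PROCEDURE get_customer_bad (p_ssn IN VARCHAR2,
                                              p_rc  OUT SYS_REFCURSOR) AS
BEGIN
  OPEN p_rc FOR
    'SELECT id, fname, lname FROM customers WHERE ssn = ''' || p_ssn || '''';
END;
/

-- SAFER: a bind variable keeps the input as data, never as SQL text.
CREATE OR REPLACE PROCEDURE get_customer (p_ssn IN VARCHAR2,
                                          p_rc  OUT SYS_REFCURSOR) AS
BEGIN
  OPEN p_rc FOR
    'SELECT id, fname, lname FROM customers WHERE ssn = :ssn'
    USING p_ssn;
END;
/
```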

#infosec #LetsSecureThisTogether This is going to piss some people off. The C suite needs to have this conversation.

Get the words “best practice” out of your vocabulary. I have been at many customer sites that needed my expertise, and someone says to me in a meeting, “Well, what is the best practice to secure our information?” I’m going to tell you right now: the bad guys are reading the same best-practice white papers and poking holes through them left and right. In addition, that audit report you received saying you are in compliance with <state your regulation> may be factually correct at that moment in time, but your information is still not secure.

For each of your systems, bring five of your senior system administrators into a room and ask them a simple question: “How would you compromise your system?” Then sit back and listen. If they are good, inside of thirty minutes you are going to start hearing things that will scare you. Let me give you some examples from my life.

We are generating audit reports, but no one is actually reviewing them. At one customer, I generated audit reports that showed invalid logins, simultaneous logins from the same username at different IP addresses, and a host of other events. I then went to our security people and asked them how often they wanted these reports: daily or weekly. I was told to review them myself and only bring things up when I found something interesting. For one, under the concept of separation of duties, your SA should not have sole responsibility for reviewing audit logs. Yes, they should review them, but a copy needs to be sent out for independent review. As the Oracle DBA, I can log on as oracle or log on as sys; I am god. I can do almost anything I want and cover my tracks. Can you spell Edward Snowden? Can you say with complete confidence that you do not have an Edward Snowden in your shop?

The patch schedule is too drawn out. At one shop, we were twelve months behind on our Oracle CPU patches because of the perception that patching the database would impact the development schedule. This is one of those times where I got maybe a little bit forceful in a meeting and pushed for getting everything patched. But the customer was adamant: “You will not do anything that will slow down development and testing.” We finally got everything patched when it was announced there was going to be an audit. The audit happened, and there were no findings. Our immediate management was happy; me, I was not so happy. When a patch comes out, the bad guys read what is being fixed, and they are quite adept at exploiting the weaknesses that are being patched.

A misconfigured webserver. I walked into one customer, and as I was setting up my audit scripts, I noticed there were over a thousand invalid login attempts from a handful of webservers every thirty minutes. The DBAs were not talking to the webserver SAs, the webserver SAs would allow the invalid login attempts to continue if the application the webserver supported was no longer in use, and, again, no one was actually reviewing the audit logs. In fact, this was a known issue that had been explained to the security group as “normal.” This is the perfect way to hide password-cracking attempts. One of the webservers, in the DMZ, had not been used for over six months but was still running. When I dug into the audit trail to see what was actually going on, I found several attempts to connect to the database as sys, system, admin, sa, root, and a host of other accounts. That server had been compromised, along with a few others. When I brought the evidence to management, they were shocked. Finally, passwords were changed on the webservers, the webservers that were no longer in use were pulled from the network, and a complete scan of the network was completed.

Excessive privileges for developers, and developers using a common development account. Yup, this still happens. I walk into a shop and am given the application username and password to do all my work. Folks, your audit trail is now toast. At that same shop, all developers had the passwords for sys and system. I never got a good reason for this. NO ONE should ever log on as sys. Application accounts should never be used for development.

This is just a small fraction of the issues I have seen in different shops. 

All of these shops either followed “best practices” or modified their practices when they learned there was going to be an audit. Every one of those shops is staffed with professional SAs who can tell you where the weaknesses are. And every shop I have ever worked in has weaknesses. Your job in the C suite is to ask these professional SAs to come into a meeting and have a safe and secure conversation about how they would crack the system. If they cannot come up with anything, you are either working at the NSA, they are trying to hide something, or you need smarter SAs.

And don’t just do this once and say “we’re done”; schedule these meetings quarterly or semiannually. This is a conversation between the C suite and your professional SAs. You need to understand where your risks are. These people are really smart, and if you listen to them, they can tell you what’s wrong and what needs to be done to fix it.

Once you have this information, give your professional SAs and their management the tools and resources they need to close these security holes.

#Oracle #TDE Ghost Data Teaser

Here is a teaser for the Oracle Transparent Data Encryption presentation

We look at an existing table with existing indexes. A policy comes out that says we need to encrypt SSNs and credit card numbers. Once we encrypt the columns and rebuild the indexes, does the unencrypted data in the index get encrypted?

Watch and find out.

Oracle Transparent Data Encryption Baseline Practices webinar

I will be giving a webinar on Oracle Transparent Data Encryption Baseline Practices on August 27, 2015 at 3 PM, sponsored by @odtug.

Why “Baseline Practices”? Well, best practices do not seem to be working, so we are going to start improving your game by setting the baseline and getting you to think about how information is secured.

This one-hour presentation includes how to, gotchas (where data can leak in clear text), and baseline practices for implementing Oracle Transparent Data Encryption. We will walk through setting up Oracle Transparent Data Encryption and establish baseline practices to secure your information, explaining where and how to encrypt the data, and where and how data leaks in plain text and HEX format. We will also explore these questions: When sending data to a standby database, when does the data stay encrypted and when can it transfer over the network in clear text? When using Oracle Streams, does data go across the network in clear text? When gathering statistics on encrypted data, where can the data be exposed unencrypted? As we discuss each of the leaks, we will also review how to mitigate the leakage and eliminate ghost data.

Register here: https://attendee.gotowebinar.com/register/7938280806383993602

#Oracle #TDE #dataleak #Histograms

While at #KSCOPE15, I was asked about encrypted data showing up in histograms.  So, I ran a few experiments and learned that encrypted data does indeed leak. I contacted Oracle through an old friend to get their input.

Here is the reply I received.

================================================================

The attack vector here is via SELECT on histogram tables. The protection profile for TDE by design does not address DB users with proper access to such tables. The gotcha here is that many people don’t realize they should control access to STATS tables as well as the tables with sensitive data.

Possible ways to workaround:

1. Configure Database Vault policy so user who tries to query sensitive columns on these views/tables is denied access

2. Do not collect stats on sensitive columns in the first place

===================================================================
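A sketch of workaround 2 using DBMS_STATS: gather histograms where skew warrants them, but none on the sensitive columns (the table and column names match the demo below; double-check the method_opt syntax for your release):

```sql
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname => NULL,
    tabname => 'T3',
    -- SIZE 1 means no histogram, so no plaintext endpoint values
    -- from the encrypted columns land in the statistics tables.
    method_opt => 'FOR ALL COLUMNS SIZE SKEWONLY FOR COLUMNS SIZE 1 CC_NBR SSN');
END;
/
```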

Here is my experiment:

-- Test 1) Note: we are putting this into a non-encrypted tablespace,
-- and we are going to explicitly encrypt column D1.

--CREATE A TEST TABLE
RLOCKARD@pdev > CREATE TABLE t1 (id NUMBER GENERATED AS IDENTITY, d1 VARCHAR2(255)) TABLESPACE not_sensitive;
Table created.

-- ENCRYPT THE DATA
RLOCKARD@pdev > alter table t1 modify (d1 encrypt using 'AES256');
Table altered.

--INSERT SOME TEST DATA
RLOCKARD@pdev > insert into t1 (D1) (select 'Encrypt your data' from dual connect by level <= 10);
10 rows created.
RLOCKARD@pdev > insert into t1 (D1) (select 'Is this encrypted?' from dual connect by level <= 5);
5 rows created.
RLOCKARD@pdev > insert into t1 (D1) (select 'Practice Secure Computing' from dual connect by level <= 20);
20 rows created.
RLOCKARD@pdev > commit;
Commit complete.

-- GATHER STATISTICS ALONG WITH HISTOGRAMS.
RLOCKARD@pdev > begin
  dbms_stats.gather_table_stats(null,'T1', method_opt=> 'for all columns size skewonly');
end;
/
PL/SQL procedure successfully completed.

-- THIS LOOKS GOOD
RLOCKARD@pdev > select
  endpoint_number,
  endpoint_actual_value
from dba_tab_histograms
where owner = 'RLOCKARD'
and table_name = 'T1'
and column_name = 'D1';

ENDPOINT_NUMBER ENDPOINT_ACTUAL_VALUE
--------------- --------------------------------
             10
             15
             35

-- HOWEVER, WHEN WE DIG A BIT FURTHER IT'S QUITE EASY TO
-- TRANSLATE ENDPOINT_VALUE INTO THE FIRST CHARACTERS OF THE
-- DATA, THEREBY EXPOSING THE INFORMATION.
-- NOTE: THIS QUERY IS FROM Jonathan Lewis's blog at: https://jonathanlewis.wordpress.com/category/oracle/statistics/histograms/

RLOCKARD@pdev > ed
Wrote file afiedt.buf

  1  select
  2    endpoint_number,
  3    endpoint_number - nvl(prev_endpoint,0) frequency,
  4    hex_val,
  5    chr(to_number(substr(hex_val, 2,2),'XX')) ||
  6    chr(to_number(substr(hex_val, 4,2),'XX')) ||
  7    chr(to_number(substr(hex_val, 6,2),'XX')) ||
  8    chr(to_number(substr(hex_val, 8,2),'XX')) ||
  9    chr(to_number(substr(hex_val,10,2),'XX')) ||
 10    chr(to_number(substr(hex_val,12,2),'XX')) ||
 11    chr(to_number(substr(hex_val,14,2),'XX')) ||
 12    chr(to_number(substr(hex_val,16,2),'XX')),
 13    endpoint_actual_value
 14  from (
 15    select
 16      endpoint_number,
 17      lag(endpoint_number,1) over(
 18        order by endpoint_number
 19      ) prev_endpoint,
 20      to_char(endpoint_value,'XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') hex_val,
 21      endpoint_actual_value
 22    from
 23      dba_tab_histograms
 24    WHERE
 25      owner = 'RLOCKARD'
 26      AND table_name = 'T1'
 27      and column_name = 'D1'
 28  )
 29  order by
 30*   endpoint_number
 31  /

ENDPOINT_NUMBER    FREQUENCY HEX_VAL                         CHR(TO_N ENDPOINT_ACTUAL_VALUE
--------------- ------------ ------------------------------- -------- --------------------------------
             10           10 456E6372797079E93CBEA1A5000000  Encrypye
             15            5 49732074686967A04440BE12C00000  Is thig
             35           20 5072616374698217A0D44499800000  Practi?

3 rows selected.


-- Test 2)
-- Important note: THIS IS ALL PSEUDO DATA, NOTHING IS REAL.

-- the test customers table contains pseudo SSNs and CC numbers for demo purposes.
-- in reality, because cc_nbr and ssn are distinct, histograms should not be gathered;
-- however, a "lazy" DBA may use the 'for all columns size skewonly' method_opt.
-- therefore, by using the defaults you will get 254 rows of data that should be encrypted.

create table t3 as select id, fname, lname, city, state, cc_nbr, ssn from customers;
alter table t3 modify (cc_nbr encrypt using 'AES256', SSN encrypt using 'AES256');

begin
  dbms_stats.gather_table_stats(null,'T3', method_opt=> 'for all columns size skewonly');
end;
/

RLOCKARD@pdev > desc t3
 Name                               Null?     Type
 ---------------------------------- --------- ---------------------------
 ID                                 NOT NULL  NUMBER
 FNAME                                        VARCHAR2(25)
 LNAME                                        VARCHAR2(25)
 CITY                                         VARCHAR2(25)
 STATE                                        VARCHAR2(25)
 CC_NBR                                       VARCHAR2(16) ENCRYPT
 SSN                                          VARCHAR2(11) ENCRYPT

RLOCKARD@pdev > select
  endpoint_number,
  endpoint_actual_value
from dba_tab_histograms
where owner = 'RLOCKARD'
and table_name = 'T3'
and column_name = 'SSN';

ENDPOINT_NUMBER ENDPOINT_ACTUAL_VALUE
--------------- --------------------------------
           4247 778294598
           4269 782777484
           4291 785731383
           4313 788768328
           4335 792928354
           4357 795685465
           4379 798987731
           4401 812732627
           4424 815857391
           4446 818188243
========SNIP A LOT======

RLOCKARD@pdev > SELECT * FROM T3 WHERE SSN='778294598';

        ID FNAME        LNAME        CITY           STATE         CC_NBR           SSN
---------- ------------ ------------ -------------- ------------- ---------------- -----------
     41742 Monica       Gaestel      Strattanville  Pennsylvania  3483712444144721 778294598

1 row selected.