
Applied Oracle Security: Developing Secure Database and Middleware Environments - Part 9




…systems in an organization that use keys. In an HSM device, the keys are processed using specialty hardware that offers fast cryptographic processing and network access, and keys are handled on a tamper-resistant platform that generates them within the device, so file system copies never exist. Keys themselves are never removed from the device in an unencrypted form, and the table keys are sent to the HSM device for decryption over a secure channel. Since such keys are becoming increasingly critical for organizations and are being used in more applications each year, building a key management strategy early in a cryptography program can be a very sound investment. Second, most networked HSM devices provide a central mechanism for managing who has access to which keys. This provides separation-of-duties and span-of-control capabilities, both of which are examined in any system security audit. Since the keys in question underpin such things as TDE functionality and the security of your SSL-protected communications, an HSM provides an extra layer of defense for key storage. Currently supported HSM vendors include the following:

SafeNet (acquired Ingrian); LUNA SA version 4.2

Ingrian DataSecure Platform (models i116, i421, and i426, as well as previous models i110, i311, and i321)

nCipher (acquired certain assets from NeoScale) netHSM 500, 2000; nShield PCI 500, 2000, and 4000

NetApp OpenKey Client with PKCS#11 API for the LKM-Appliance ‘KM-500’

RSA RKM for the Datacenter Module for Oracle 11g TDE

Thales e-Security (acquired nCipher)

nuBridges

Utimaco

Implementation of an HSM device is accomplished by adding a vendor-specific library to the host operating system:

/opt/oracle/extapi/32/hsm/<HSM_VENDOR_NAME>/<VERSION>/libpkcs11.so

The directory (/opt/oracle) is important: it must exist (create it if it doesn't, and make sure that the file ownership is "oracle" and the filesystem permissions allow read and write access to the directory, or set it up as specified by the HSM vendor's guidelines). It will then be possible to configure the database to use this operating system device to store keys in a PKCS #11 keystore.

By changing a setting in the SQLNET.ora file, the HSM device can be specified as a METHOD for storing the Master Key:

# sqlnet.ora Network Configuration File: /home/oracle/product/11.1.0/db_1/network/admin/sqlnet.ora
# Generated by Oracle configuration tools.

NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)

ENCRYPTION_WALLET_LOCATION =
  (SOURCE=
    (METHOD=HSM)
    (METHOD_DATA=
      (DIRECTORY=/home/oracle/product/11.1.0/admin/aos/encryption_wallet)
    )
  )

Notice that the directory still points to the file-based encryption wallet. At this point, a security administrator can migrate the Master Keys used for column-based encryption:

ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "HSMusername/HSMpassword"

MIGRATE USING "<existing software wallet password>"

This tells the database to reencrypt all TDE column keys with the newly created, HSM-managed Master Key. Because the HSM method is specified in the SQLNET.ora file, the database can communicate with the HSM device through the API exposed by the vendor-provided library.
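As a quick sanity check (not part of the migration itself), the V$ENCRYPTION_WALLET view reports the wallet type, location, and status the instance is currently using:

-- Confirms which wallet the database sees and whether it is open.
SELECT wrl_type, wrl_parameter, status
  FROM v$encryption_wallet;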

Use of an HSM device for Master Key storage is possible only with column-level TDE in the current 11.0.6 release; tablespace encryption still requires the Master Key to exist in the Oracle Wallet. Using column-based encryption with keys stored in an HSM device while also allowing tablespace encryption is possible, but tablespace encryption continues to use a file-based Oracle Wallet in the 11.0 release. This is done as follows:

"ALTER SYSTEM SET ENCRYPTION KEY IDENTIFIED BY "HSMusername/HSMpassword"

MIGRATE USING "<existing wallet password>".

By using the same password for the wallet and the HSM, and by not removing the (DIRECTORY=/home/oracle/product/11.1.0/admin/aos/encryption_wallet) syntax, an OPEN WALLET command will make available both the Master Key for columns (stored in the HSM device) and the Master Key for tablespaces (stored in the wallet).

The next release of TDE, 11.0.7, will allow the tablespace Master Key to be created directly in the HSM (it will not need to be migrated when it is created for the first time).

By opening Oracle Wallet Manager with the original wallet password (from the tablespace encryption example) and then selecting the Auto-Open option, you can re-save the wallet as an auto-open wallet that exists on the filesystem as cwallet.sso. At this point, you can encrypt moderately sensitive data by storing it in a tablespace encrypted with TDE's tablespace feature. Since the wallet is set to auto-open, when the database restarts, any data stored in those tablespaces is available; however, it is still stored in encrypted form on the drives and on any backup media. Very sensitive data (credit card PANs, SSNs, and so on), however, is protected by encrypting the columns, with the keys for this data encrypted using the Master Key stored in the HSM device. A security administrator is then responsible for opening and managing the HSM device-stored keys (ALTER SYSTEM SET WALLET OPEN IDENTIFIED BY "HSMusername:HSMpassword").
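The following minimal sketch shows the two tiers side by side; the tablespace, schema, table, and algorithm choices are illustrative only and are not taken from the book's examples:

-- Moderately sensitive data: protected by placing it in an encrypted tablespace
-- whose key is protected by the (auto-open) wallet.
CREATE TABLESPACE sensitive_ts
  DATAFILE '/u01/oradata/aos/sensitive_ts01.dbf' SIZE 100M
  ENCRYPTION USING 'AES128'
  DEFAULT STORAGE (ENCRYPT);

CREATE TABLE app.customer_profile (
  customer_id NUMBER,
  email       VARCHAR2(256)
) TABLESPACE sensitive_ts;

-- Very sensitive values: column-level TDE, whose table key is in turn protected
-- by the Master Key held in the HSM.
CREATE TABLE app.payment_card (
  customer_id NUMBER,
  pan         VARCHAR2(32) ENCRYPT USING 'AES256' NO SALT
);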

Summary

Encryption is quickly moving from the wish list to the must-have list as organizations face tighter internal and external security standards. The ability of organizations to prove that they are doing enough to protect data assets is enhanced with the addition of encryption. Transparent data encryption helps provide greater security for sensitive data throughout the data life cycle by encrypting data as it is written to disk and decrypting it as needed. The Master Key needed to perform encryption/decryption operations may be controlled by a non-DBA user, such as a security officer/administrator. Having this separation of duties is important not only because it is mandated by nearly every security target, but also because it helps to ensure that data is protected in such a way that both fraud and collusion must occur to carry out particular actions.

The use of column-level TDE, provided in 10g R2, offers strong encryption of sensitive columns, but it has limitations regarding datatypes and additional database capabilities. It can use an HSM device to secure the Master Key, and the performance impact is generally fairly low. It provides both encryption capabilities and a separation of the administrative duties for encryption key management.

Version 11g introduced tablespace encryption, which makes it possible to protect entire swaths of data stored in database servers and removes many of the restrictions faced by column-level encryption. Tablespace encryption provides the most transparent mechanism for protecting data in both new and existing applications. The performance impact of either column or tablespace encryption is minimal (the average impact is in the single digits in most test scenarios for non-indexed encrypted values), and implementation can be done as databases are upgraded and tested in the 11g environment, which, as you will find in future chapters, provides a number of security features that allow you to strengthen the security of your database, your applications, and the movement of data between them.


Applied Auditing and Audit Vault

In the security life cycle, we think of the phases as protect, detect, respond, and remediate. Most of the security time and attention is focused on the protect phase. However, we cannot always protect everything from everyone or everyone from everything. This leads us naturally to the detect phase. Sitting squarely at the center of detect is auditing.

If you conducted a survey of security professionals, you would probably find that auditing is considered one of the least exciting areas of security. Security professionals know they must address auditing at some level, but it rarely gets top attention, for several reasons: some don't know what to audit; some don't know how to audit; some don't know what to do with the audit records once they are generated; and some believe auditing is "overhead" and a performance penalty that doesn't justify the additional resources that may be required. While these reasons have some legitimacy, at the end of the day they are nothing more than excuses for not doing something that should be done.

In this chapter, we look at how to apply effective auditing techniques. The discussion highlights the value that auditing provides. We also explore Oracle Audit Vault's basic architecture and components and review its various options and capabilities.

An excellent primer for this chapter is Chapter 8, "Effective Auditing for Accountability," in Effective Oracle Database 10g Security By Design. That text adequately covers how to enable auditing and the differences between writing your own auditing, using standard auditing, and using fine-grained auditing. Many concepts and principles described in that book also apply here. Our goal is to consider auditing beyond a single database instance. The challenge, to an extent even within a single database, is to join and relate all the information you can capture: SYS operations, Data Definition Language (DDL), Data Manipulation Language (DML), selective auditing, and even the data value changes themselves. You'll see how the audit records show you the answers to the who, what, where, when, and how of data access. You will see that effective auditing across your databases is not only possible, but invaluable.

An Era of Governance

With the new era of Governance, Risk Management, and Compliance (GRC) firmly upon us, security and auditing excuses are quickly evaporating. As the custodians of data, IT professionals are being asked to protect personally identifiable information (PII), personal health information (PHI), and an assortment of other valuable data, such as credit card numbers or bank account information used in electronic banking transactions. We must now comply with a cadre of new legal requirements: Sarbanes-Oxley (SOX), state privacy laws, the Payment Card Industry's Data Security Standard, EU privacy directives, the Corporate Law Economic Reform Program Act (CLERP9), and Basel II, to name just a few.

In this emerging GRC and privacy era, these requirements are just that: requirements. They are no longer optional and are now part of company business plans. They are not considered extra overhead or a luxury item (that is, "if we have spare time, money, and computing cycles, then we'll audit"). Some consider auditing a tax, but as with all taxes, it still has to be paid.

An emerging view of auditing is now gaining popularity, however. From a GRC perspective, auditing provides an additional layer of oversight to existing access and physical controls. You can think of auditing as a safety net that catches things that fall through the protection controls. In these instances, while you cannot prevent a security compromise (because if you could have, you would have), you can detect it when it occurs. This allows you to respond to and remediate any data breaches or other negative events. The timeliness of the detection and response is critical to preserving public image and to setting up prevention techniques that will protect against future attacks.


Auditing for Nonsecurity Reasons

In some cases, auditing is set up not only to detect bad things that might occur but to act as an overall system monitor. Auditing can tell us who has accessed what, from where, when, and how. Another benefit of capturing audit data is that we can use this intelligence when considering overall enterprise IT issues such as resource distribution, scalability requirements, and underused and overused resources. This last point is also important to GRC, and many people are starting to understand that auditing offers real value, not only in the event of a security breach but in all cases.

TIP
Effective auditing is not just about security; it has added value for system and application profiling.

You can even use auditing to identify when and if certain business conditions have been reached. For example, you could use audit records to detect and alert you when big financial transactions occur. This would be particularly beneficial when the transaction spans multiple databases or is not part of a single database transaction.
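A fine-grained auditing policy is one way to capture only those statements; the schema, table, column, and threshold in this sketch are hypothetical:

BEGIN
  DBMS_FGA.ADD_POLICY(
    object_schema   => 'FIN',                    -- hypothetical schema
    object_name     => 'WIRE_TRANSFERS',         -- hypothetical table
    policy_name     => 'LARGE_TRANSFER_AUDIT',
    audit_condition => 'AMOUNT > 1000000',       -- "big transaction" threshold
    audit_column    => 'AMOUNT',
    statement_types => 'INSERT,UPDATE',
    audit_trail     => DBMS_FGA.DB + DBMS_FGA.EXTENDED);
END;
/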

With a consolidated auditing system, aggregate counts and group-by queries on the audit trail can show you patterns of interest (a sample query follows the list below). For example, you can identify facts and frequencies for the following:

Server accesses

Network paths (client to server)

Which users are using which databases or a particular database of interest

Time factors (of year, of week, of day)

Types of transactions or facts that tell you who is using data (sets) X, Y, and Z
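Here is the sample query promised above, written against the standard DBA_AUDIT_TRAIL view for simplicity; with Audit Vault the same style of group-by analysis runs against its consolidated repository:

-- Who is using which database objects, from where, and how often.
SELECT username,
       userhost,
       action_name,
       TO_CHAR(timestamp, 'DY') AS day_of_week,
       COUNT(*)                 AS events
  FROM dba_audit_trail
 GROUP BY username, userhost, action_name, TO_CHAR(timestamp, 'DY')
 ORDER BY events DESC;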

We'll explore these and other examples more thoroughly later in the chapter. Oracle Audit Vault, introduced in Oracle Database 10g Release 2, makes major strides toward these goals by allowing you to collect audit data from all Oracle databases within your environment as well as from other sources such as DB2 Universal Database (UDB), Sybase ASE, and Microsoft SQL Server (with more sources planned for future releases of Audit Vault).

The Audit Data Warehouse

In many discussions with customers throughout the years, we have observed a common pattern in the area of auditing. These discussions were not focused on the tactical facets of turning auditing on or off (we refer to that as "micro auditing"). Rather, they concerned the need to have a holistic view of what was going on, from an auditing perspective, everywhere in the enterprise. We call this perspective on auditing "macro auditing." If you looked through all the objectives and requirements, you would conclude that everyone was essentially trying to build the same thing. We found that people were asking for an effective way to build what could be described as an audit data warehouse.

Understanding the design intent is always critical to a complete understanding of why you should use it, when you should use it, and how you should use it. With Audit Vault, the design intent was to create a product that captured the requirements of what everyone was already building, or trying to build, on their own. Note that Oracle security products have largely become "products" through this principle of identifying a common set of customer requirements and then solving those requirements through a professionally developed product.

The benefit of productizing common implementation patterns is that engineering changes to the core products are often required and are essential to meeting the intended requirements. Through the product creation process, those changes can be made. Product optimizations, such as better methods for storing and retrieving metadata, can also be accomplished, along with gaining and sharing best practices. This is precisely what occurred with Audit Vault.

As we explore the needs for an Audit Vault, we must define the requirements in terms of high-level objectives. This is important because technical implementations or technical features may be underutilized, or not used at all, if these objectives and requirements are not fully understood. Many would say a far worse scenario exists when one undergoes the custom buildout of an application that matches an existing product, that is, building an Audit Vault from scratch. Again, this often occurs because of a lack of understanding about how to connect technology features to high-level organizational and application objectives.

Audit Warehouse Objectives

Let's look at the intent and requirements we have considered with respect to macro auditing and creating an audit warehouse.

Consolidated View Across the Enterprise

Our first key objective to explore is how to get a cross-enterprise view of our audits. As you know, GRC is a primary driver, as is looking for ways to optimize the IT infrastructure. If you think about what is required to create a consolidated view, you can map it easily to other systems. For example, organizations typically want to aggregate and analyze key financial data. Organizations also need to aggregate and analyze human resources (HR) information (for example, to be able to answer the question, "How many employees do we have?"). Other operational systems, such as sales databases and facility databases (databases that maintain all the building and capital assets), also require that the information be gathered and analyzed using sophisticated reporting and analysis tools.

This is accomplished today through the multibillion-dollar technology field called Business Intelligence (BI). BI is in fact such a huge value to organizations that the success or failure of an organization is largely dependent on how much insight it can glean from its business operations. Being able to identify suboptimal expenses and operations, and then act to rectify them quickly, can be the difference between a growing and profitable organization and one that does not grow or profit.

In a gross oversimplification, you could say that doing BI involves two key elements: first, you need to get all the data, and second, you need to analyze it. The multibillion-dollar industry exists because both elements are quite complex and require sophisticated technology to accomplish correctly and efficiently. As a testament to the maturity and importance of BI, consider that Oracle Corporation has been actively working in the BI field for almost 20 years and has developed and acquired an extensive portfolio of tools and technologies that can be used to construct a cross-enterprise BI capability.

Two critical points should be noted. First, BI is essential to the success of every organization, and effectively implementing it requires advanced tools, technologies, and techniques. Second, getting intelligence from the data is nontrivial, and no one should embark on a project to re-create BI tools, technologies, and techniques.


Data Warehouse Recall that the first objective of BI is getting all the data, which is commonly accomplished via a data warehouse. A data warehouse is simply a single database that acts as the resting point for all the collected data. "Warehouse" is a very appropriate term, as in many ways a data warehouse has to replicate a real-world warehouse: it must be organized; products or information must be able to flow in and out of it efficiently; and it generally contains a lot of stuff.

Implementing a data warehouse involves three basic steps: extract the data, transform the data, and load the data.

Extracting the data means that you must pull the data from the place in which it resides. For Oracle database auditing, this translates to replicating (in a figurative sense) the audit logs from the databases of interest. Across an enterprise, this would mean finding a way to extract all the auditing and logging data.

Data transformation means the data is put into a form that is consistent in meaning and often in format. From an auditing perspective, this could mean being able to define and identify users, applications, and servers consistently. You can imagine the confusion if something as basic as a user's identity is not represented in a meaningful or consistent way; this would likely result in an inability to correlate audit records from different applications and databases.

Data loading addresses the other end of the data extraction process, which is to load, or fill up, the warehouse. An infrastructure that allows large amounts of data to be loaded quickly and easily is required, as is a data model that includes such things as indexes and partitions. All of this must be done using technology that supports the vast amount of data that will ultimately be loaded.

For enterprise auditing, you can now see the huge similarities. You would probably predict that a lot of audit records exist across the enterprise. You would also predict that an infrastructure would need to be in place to collect all the data across the enterprise, and you would identify the need to transform the data into a common description to understand what is really happening.
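The following deliberately simplified sketch ties the three steps together; the AUDIT_WH table, its columns, and the SRC_DB database link are hypothetical, and a production design (Audit Vault included) relies on purpose-built collectors rather than a plain INSERT over a database link:

-- A partitioned landing table in the warehouse (the load step's data model).
CREATE TABLE audit_wh (
  source_db   VARCHAR2(64),
  db_user     VARCHAR2(128),
  os_user     VARCHAR2(128),
  client_host VARCHAR2(128),
  action_name VARCHAR2(64),
  object_name VARCHAR2(128),
  event_time  DATE
)
PARTITION BY RANGE (event_time) (
  PARTITION p_2008_h1 VALUES LESS THAN (TO_DATE('2008-07-01','YYYY-MM-DD')),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);

-- Pull recent records from one source database (the extract step); the column
-- mapping is where the transform step would normalize identities and names.
INSERT INTO audit_wh
SELECT 'SRC_DB', username, os_username, userhost,
       action_name, obj_name, timestamp
  FROM dba_audit_trail@src_db
 WHERE timestamp >= TRUNC(SYSDATE) - 1;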

Once the information is loaded, you need the second BI element: analysis tools.

BI Tools To some, nothing is scarier than having to create useful and summarized information from terabytes of data. Successfully accomplishing this requires addressing quite a few issues, including the ability to find, correlate, aggregate, and disseminate the information.

Each of these tasks takes time and effort to do well. For auditing, we need to search through the warehouse to find individual users, tables, applications, and so forth, and we need to be able to relate the information across the various pieces of the warehouse.

For example, if we want to understand every task a user performed, we may have to correlate the user's authentication with the application invocation, down to the actual database SQL or PL/SQL that resulted from the application interactions. Another valuable piece of information could be derived simply by being able to see what data a user manipulated across several core databases (or, by way of association, the applications the user accessed).
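That kind of correlation might look like the following sketch; LOGON_EVENTS and DML_EVENTS are hypothetical warehouse tables, and the join keys depend entirely on how identity was normalized during the transform step:

-- Every DML action attributable to one user, tied back to the session window
-- in which it occurred.
SELECT l.global_user,
       l.client_host,
       d.source_db,
       d.object_name,
       d.action_name,
       d.event_time
  FROM logon_events l
  JOIN dml_events   d
    ON d.global_user = l.global_user
   AND d.event_time BETWEEN l.logon_time AND l.logoff_time
 WHERE l.global_user = 'JDOE'
 ORDER BY d.event_time;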

As you may gather, you'll want to be able to obtain quite a few legitimate nuggets of information from the audit data you collect. What we have learned about BI can be applied here once again: you need to know a lot of information, and you will need advanced tools to discover, correlate, and present it. BI products, especially Oracle's BI Suite, offer an off-the-shelf solution for culling the meaningful information from the reams of data you collect.

This is an important fact to internalize. Most custom-built BI applications do a tenth of what a commercial product would do, and the overall operations and maintenance costs tend to run ten times the product and support fees that would otherwise have been paid. Therefore, the best practice for enterprise auditing is to use commercial technologies to perform BI: create a data warehouse and then use the analysis tools to find the information of value.

Secure Audit Records

One of the underlying principles of effective auditing is security for the auditing itself. For auditing to hold its integrity as an effective mechanism, the audit logs or audit trail have to be secured. This may be a challenge, because one of the prime uses of auditing is to detect unauthorized actions conducted by privileged users; that is, you need to secure auditing from the people who have access to the auditing.

This is best explained through the following scenario: a nefarious person conducts a nefarious deed, and the audit function captures it. A strong requirement exists to keep that same nefarious person from deleting his malintentioned actions from the audit logs.

Securing Audit at the Source Here's the classic example: someone accessing a database from a DBA account alters core and critical data. (Note that this doesn't have to be the DBA account; just assume it's an account with a few critical system privileges.) The point in our use case is that the data changes are not authorized. They were allowed to occur; that is, the database was unable to prevent or protect the data because the account used possessed system privileges that allow him to change the data. This is not an uncommon scenario, as you will see in Chapter 4. Many accounts (default and custom) have critical system privileges, and those privileges are required to perform legitimate and authorized work. The problem occurs because the privileges are not technically constrained to include only the authorized work (thus the need for Oracle Database Vault, as discussed in Chapter 4).

If the data is critical, database auditing has been enabled, and it will be capturing updates to the data records. This means the changes are captured via the database's auditing functions. Note that this is one of the best use cases for micro auditing: it captures privileged accounts abusing their privileges. Let's assume that the person knows that database auditing is enabled. In that database, with the proper system privileges, the person who changed the data can now simply go to the audit logs and delete the records that recorded the original data manipulation. This effectively covers his tracks from the scene of the crime.
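The micro auditing described in this scenario can be enabled with standard object auditing; the table name here is hypothetical:

-- One audit record per statement (BY ACCESS), so every UPDATE or DELETE against
-- the critical data is individually captured.
AUDIT UPDATE, DELETE ON fin.gl_balances BY ACCESS;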

Note that you can turn on auditing for the audit logs themselves, thus generating an audit record indicating that the audit logs were tampered with, but this gets us into a recursive discussion. Also note that, for completeness, you can audit the NOAUDIT command as well, to detect someone disabling auditing, doing bad things, and then later re-enabling auditing.

How, then, do you secure the audit logs, given that you may be auditing to detect unauthorized actions from privileged accounts? The answer is simply to move the audit logs from the database that is being audited to someplace else. This solves the problem based on the assumption that the privileged accounts are accessible only for the database being audited and that the same people do not have access to the audit destination. You can write the audit information to a different location, perhaps the file system, or you can ship the audit trail to a different location to protect it.

This was the genesis of the conversation for many Oracle customers. The topic required conversation because the extract process (recall that this is not only a good idea for security but also the first step in data warehousing) is not readily done in Oracle. The audit trail could not be replicated via standard Oracle Advanced Replication technology, because the audit records by default (prior to Oracle Database 10g R2) were owned by SYS. To be specific, you cannot create triggers on SYS-owned tables, and triggers are the heart of what makes Oracle Advanced Replication work.


We will not discuss other alternatives to this dilemma here, other than to say that the solution exists with Audit Vault.

Securing Audit at the Warehouse The conversation has now flowed from the need to obtain cross-enterprise awareness, to a discussion of how to build a data warehouse and perform analytics, to security at the audit source. Both the notion of needing a data warehouse and the need for security tell us that we want to move the data out of its source and into a single database, where it can be used for reporting and analysis.

With the centralization of important data comes a requirement for increased security for that data. This immediately puts a focus on the security of the audit data warehouse, for the following reasons: although the original user may not have access to the audit data warehouse, someone may still be able to make unauthorized changes to the data; and from an auditing perspective, especially when the audits are used for compliance reasons or for possible legal actions, the audit repository itself needs to be hardened. Regardless, in most cases the desire, if not the requirement, is to have a secure audit system.

It does not make sense to send all of the security audits to an unprotected system. This fact, combined with the need to run advanced reporting and analytics, usually rules out any file system–based approach, even if the files are stored in XML. All of these reasons have driven conversations on how to secure a database in general; in these specific situations, the databases require greater security because they are acting as a centralized auditing data warehouse.

What to Audit and When to Audit

Given that you have the capacity to audit, the discussion now moves to what actions should be audited and when you should do the auditing. First, realize that you can't audit everything. To some, auditing everything sounds like a good idea: if you audit everything, you are guaranteed to have audited, and thus captured, every bad thing that happened.

While this is true theoretically, problems arise for two reasons. The first and most obvious involves performance and resources. Auditing consumes resources, which impacts performance. For selective auditing, the impact may be negligible, but it will occur nonetheless. Auditing everything maximizes the drain on resources. In many cases, auditing or anything analogous (such as logging) is not used on every possible action for this very reason. This leads us to the second logical reason for not auditing everything all the time: data proliferation.

An audit-everything approach means that the audit trail will grow as rapidly as possible. This is not so much a storage concern as it is a "needle in the haystack" problem. When you audit everything, all the legitimate actions are captured along with those you may be trying to find. The problem is that the legitimate actions create audit records, possibly by the thousands, which conceal the audit records for the things you are most interested in. The goal here is to capture only the needles while letting the hay fall through.

Guiding Principles

When considering what to audit, the obvious conclusion is to audit the most critical and sensitive data. For example, if a table is used for financial reporting, you may want to audit all DML actions on that table. If done correctly, this will later allow you to verify that the numbers used for the financial reporting were not tampered with or manipulated in any unauthorized way by any unauthorized person. In the case of a security breach, you need enough information to prosecute. This information includes who logged in, from where, at what time, and what actions were performed; taken together, this information can imply malicious intent.
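A sketch of that who/where/when reconstruction, run against the standard audit trail (the owner and object name are hypothetical):

SELECT os_username,   -- who, at the operating system level
       username,      -- who, as a database account
       userhost,      -- from where
       timestamp,     -- at what time
       action_name,   -- what was done
       returncode     -- whether it succeeded (0) or failed
  FROM dba_audit_trail
 WHERE owner    = 'FIN'
   AND obj_name = 'GL_BALANCES'
 ORDER BY timestamp;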
