Referencias | References


Referencias completas de vocabulario, eventos, crónicas, evidencias y otros contenidos utilizados en los proyectos relacionados con biotecnología y neurociencia de la KW Foundation.

Full references for vocabulary, events, chronicles, evidence and other content used in KW Projects related to biotechnology and neuroscience.


D

Damajuana [103]

by System Administrator - Monday, 6 January 2014, 23:17

Damajuana (demijohn)

A large bottle, usually protected by a wicker or plastic basket.

Data Center Protection [795]

by System Administrator - Friday, 29 August 2014, 17:01
 

Modernizing data center protection

 

  • Changes in workload recovery requirements and workload protection mechanisms.
  • The sheer amount of production and protection storage required.

Part one of our three-part series on modernizing data protection and disaster recovery takes a look at how data center backup and DR are evolving today.

Protection and recovery requirements are changing

As the platforms that host our production resources change, the protection methods must change with them. As one notable example, with the mass adoption of virtualization, many of the traditional methods for backing up server data have evolved, been replaced, or been supplemented. Whereas each production server used to have its own agent, the ideal scenario for most environments today is to utilize virtualization host-centric data protection mechanisms, which use hypervisor-specific APIs to enable whole (virtual) machine backups while still offering granular restore capabilities. In addition, as production data continues to migrate from traditional data center servers to mobile devices and cloud platforms, the protection and recovery requirements have to evolve accordingly.

Because of increasing dependencies on data, tolerance for downtime or data inaccessibility of any kind is increasingly low. But in order to gain a broader range of recovery agility, one must often use a broader range of protection mechanisms, including snapshots and replication, in addition to traditional backups.
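To make that concrete, here is a minimal, purely illustrative sketch (not from the article) of cataloguing workloads against recovery objectives and the protection mechanisms discussed above; the workload names, RPO/RTO values and the threshold rule are all hypothetical.

```python
# Illustrative only: map each workload to recovery objectives and the
# protection mechanisms (backup, snapshot, replication) chosen to meet them.
from dataclasses import dataclass

@dataclass
class ProtectionPolicy:
    workload: str
    rpo_minutes: int       # acceptable data-loss window
    rto_minutes: int       # acceptable downtime window
    mechanisms: tuple      # e.g., ("backup", "snapshot", "replication")

policies = [
    ProtectionPolicy("payroll-db", rpo_minutes=15, rto_minutes=60,
                     mechanisms=("backup", "snapshot", "replication")),
    ProtectionPolicy("file-share", rpo_minutes=240, rto_minutes=480,
                     mechanisms=("backup", "snapshot")),
]

def needs_replication(policy: ProtectionPolicy) -> bool:
    # Hypothetical rule: a tight RPO pushes a workload beyond backups alone.
    return policy.rpo_minutes <= 15

for p in policies:
    print(p.workload, "replicate:", needs_replication(p))
```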

Data growth is forcing changes in protection and recovery

The other primary driver -- beyond the desire to improve recovery agility and keep pace with evolving production platforms -- is simply the necessity to change because the status quo is unsustainable with today's data growth. Enterprise Strategy Group research indicates that primary storage is growing by nearly 40% annually, but overall IT spending and storage-specific spending are growing at nowhere near that rate. IT professionals are being forced to store data more efficiently, while also expanding the types of protection and recovery capabilities they offer. At first glance, those two trends might appear contradictory; in fact, the synergies between them are driving the most exciting parts of how IT is evolving from a backup mentality to a data protection strategy -- one that includes not only backups, but snapshots and replication as well.

Snapshots

While not necessarily new, the use of snapshots has evolved over the past few years. By reverting to a snapshot within primary storage, users can recover to a previous, albeit somewhat recent, point in time much faster than restoring from any backup on secondary storage. And, because of the very granular nature of snapshots, whereby disk blocks that aren't changed do not incur any storage consumption, snapshots can also partially address storage-scale issues related to multiple near-term copies held within a backup server's secondary storage pool.

Those capabilities aren't new, but the extended management and flexible usability of snapshots are -- and that is making all the difference. In the past, snapshots (as a storage-centric technology) were managed solely by the storage administrator, typically without coordination with the upper-level applications or backup applications. Today, many storage array manufacturers have developed extensions so that snapshots of common business applications can be taken in a more coordinated fashion, thereby ensuring a more application-consistent recovery. In addition, the usability of snapshots has evolved to enable granular file- or object-level restores that can be invoked from the snapshot management UI, an application/platform UI (e.g., database or compute hypervisor), or from within the backup application. By integrating the management (invocation schedules for snapshots and restores) and monitoring (health awareness of the underlying storage), snapshots are now a much more holistic aspect of an overall data protection strategy.
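As one concrete example of how a snapshot can be scripted into a protection schedule, the sketch below drives the standard LVM lvcreate/lvremove commands from Python. It assumes an LVM-backed Linux volume and sufficient privileges; the volume group, logical volume, snapshot name and size are placeholders, and any application-level quiescing would have to happen before the snapshot is taken.

```python
# Minimal sketch: create and later remove a point-in-time LVM snapshot using
# the standard lvcreate/lvremove CLI tools. Names and size are placeholders.
import subprocess

def create_snapshot(vg: str, lv: str, snap_name: str, size: str = "5G") -> None:
    # Allocate copy-on-write space for the snapshot; unchanged blocks consume nothing extra.
    subprocess.run(
        ["lvcreate", "--snapshot", "--name", snap_name, "--size", size, f"/dev/{vg}/{lv}"],
        check=True,
    )

def remove_snapshot(vg: str, snap_name: str) -> None:
    subprocess.run(["lvremove", "-y", f"/dev/{vg}/{snap_name}"], check=True)

if __name__ == "__main__":
    create_snapshot("vg0", "data", "data-snap-20140829")
    # ... mount the snapshot read-only, back it up, or restore files from it ...
    remove_snapshot("vg0", "data-snap-20140829")
```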

Replication

While snapshots provide a complement to backups through rapidly restorable versions within the primary storage, replication creates yet another copy of the data -- most often on tertiary storage. This provides a survivable copy of data at a geographically separate location, typically as part of a business continuity or disaster recovery scenario.

It is essential to understand the mechanisms that are facilitating replication, which will affect the efficiency of the replication itself, as well as the usability of the data. Replication can be achieved at multiple levels within an infrastructure stack.

Application-centric replication (e.g., SQL database mirroring) is accomplished between the primary application engine and one or more partner application engines. It provides an immediately usable secondary instance of the data, since the entire stack (OS, platform and storage) exists under each application engine. Efficiency will vary by platform, but each platform must be managed separately -- through separate UIs, with separate strategies, often by separate individuals (e.g., database administrators).

OS/platform-centric replication encompasses a variety of technologies, including file-system centric replication (e.g., Windows Distributed File System/DFS), virtual-machine replication as facilitated between hypervisors, or third-party block- and file-centric replication offerings. Most of these products are designed to replicate data as part of enabling a high availability scenario. It is notable that resuming functionality may not be transparent to the users in many cases, but the switchover window is often negligible.
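As a very simple stand-in for OS/file-level replication, the sketch below periodically mirrors a directory to a secondary location with rsync. rsync itself is a real tool, but the paths, host and interval are placeholders, and this one-way copy only illustrates the idea; it provides none of the transparent failover that hypervisor- or array-based replication offers.

```python
# Illustrative near-line file replication: mirror a directory to a replica
# host every few minutes with rsync. Paths, host and interval are placeholders.
import subprocess
import time

SOURCE = "/srv/data/"                 # trailing slash: replicate the contents
TARGET = "replica-host:/srv/data/"    # placeholder remote destination

def replicate_once() -> None:
    # -a preserves permissions/timestamps; --delete keeps the replica in sync.
    subprocess.run(["rsync", "-a", "--delete", SOURCE, TARGET], check=True)

if __name__ == "__main__":
    while True:
        replicate_once()
        time.sleep(300)  # five-minute replication interval
```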

Storage-centric replication typically impacts the application/server CPU the least, since the storage array -- often an external appliance with other advanced capabilities beyond replication -- does the work. While storage-based replication achieves the same "data survivability" goals as the other tiers of replication, the secondary instance of the data isn't necessarily for geographically separate scenarios. Some environments will replicate a second copy within the original or a nearby site, so that the higher stack (application, OS, VM) has twin copies of data to access with transparent/synchronous capabilities. In other environments, the storage copies will be at separate facilities, but will require the second infrastructure stack to be recreated (in advance or upon crisis) before the secondary storage copy can be mounted and utilized.

Continuous data protection (CDP) and near-CDP. CDP products often combine aspects of the other replication mechanisms: application integration, multi-platform management and highly granular replication. Storage Networking Industry Association purists would add that, along with truly continuous replication, CDP products should offer granular recovery to any of the infinite previous points in time using journal-like behaviors, while near-CDP products provide near-continuous protection (seconds or less of latency) without the infinite/granular restore option.
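To illustrate the journal-like behavior in the simplest possible terms, the toy sketch below polls one file and copies every new version into a timestamped journal, so any captured point in time can be restored. This is only an analogy for the concept; real CDP products intercept writes at the block or I/O layer rather than polling files, and the file name and interval here are placeholders.

```python
# Toy "journal" illustrating the near-CDP idea: each time the watched file
# changes, keep a timestamped copy so earlier points in time can be restored.
import shutil
import time
from pathlib import Path

WATCHED = Path("important.db")   # placeholder file to protect
JOURNAL = Path("journal")
JOURNAL.mkdir(exist_ok=True)

def poll_and_journal(interval_seconds: int = 5) -> None:
    last_mtime = None
    while True:
        if WATCHED.exists():
            mtime = WATCHED.stat().st_mtime
            if mtime != last_mtime:
                stamp = time.strftime("%Y%m%dT%H%M%S", time.localtime(mtime))
                shutil.copy2(WATCHED, JOURNAL / f"{WATCHED.name}.{stamp}")
                last_mtime = mtime
        time.sleep(interval_seconds)

if __name__ == "__main__":
    poll_and_journal()
```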

By combining the agility of snapshots, the durability of replicas, and the flexibility of backups, you have what you'll need to truly modernize the protection of your data center.

Link: http://searchdatabackup.techtarget.com

Data Center [475]

by System Administrator - Wednesday, 23 July 2014, 20:14
 

 

Data Centers are the New Polluters [833]

by System Administrator - Thursday, 4 September 2014, 02:14
 

Data Centers are the New Polluters

By Patrick Thibodeau

IT managers may be too cautious about managing power, and businesses are unwilling to invest in efficiency, study finds.

U.S. data centers are using more electricity than they need. It takes 34 power plants, each capable of generating 500 megawatts (MW) of electricity, to power all the data centers in operation today. By 2020, the nation will need another 17 similarly sized power plants to meet projected data center energy demands as economic activity becomes increasingly digital.

Increased electrical generation from fossil fuels means the release of more carbon emissions. But this added pollution doesn't have to be, according to a new report on data center energy efficiency from the Natural Resources Defense Council (NRDC), an environmental action organization.

In terms of national energy use, data centers consumed a total of 91 billion kilowatt-hours (kWh) in 2013 and, by 2020, will be using 139 billion kWh, a 53% increase.

 

This chart shows the estimated cost and power usage by U.S. data centers in 2013 and 2020, and the number of power plants needed to support this increased demand. The last column shows carbon dioxide (CO2) emissions in millions of metric tons. (Source: NRDC)

The report argues that improved energy efficiency practices by data centers could cut energy waste by at least 40%. The problems hindering efficiency include comatose or ghost servers, which use power but don't run any workloads; overprovisioned IT resources; lack of virtualization; and procurement models that don't address energy efficiency. The typical computer server operates at no more than 12% to 18% of capacity, and as many as 30% of servers are comatose, the report states.
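The "comatose server" problem lends itself to a very simple first-pass check. The sketch below is illustrative only: the host names, utilization figures and the 3% threshold are made up, and a real audit would also examine network and storage activity before declaring a server dead.

```python
# Illustrative check for "comatose" servers: flag hosts whose average CPU
# utilization stays below a threshold. All figures here are hypothetical.
avg_cpu_utilization = {      # percent busy over the last 90 days (made up)
    "web-01": 14.2,
    "app-07": 2.1,
    "db-03": 41.0,
    "legacy-22": 0.4,
}

COMATOSE_THRESHOLD = 3.0     # percent; persistently below this is suspect

comatose = [host for host, util in avg_cpu_utilization.items()
            if util < COMATOSE_THRESHOLD]
print("Candidates for consolidation or decommissioning:", comatose)
```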

The paper tallies up the consequences of inattention and neglect on a national scale. It was assembled and reviewed with help from organizations including Microsoft, Google, Dell, Intel, The Green Grid, Uptime Institute and Facebook, which made "technical and substantial contributions."

The NRDC draws a sharp distinction between the large data centers run by major cloud providers, which account for only about 5% of total data center energy usage, and the rest of the industry. Throughout the data center industry, there are "numerous shining examples of ultra-efficient data centers," the study notes. These aren't the problem. It's the thousands of other mainstream business and government data centers, and small, corporate or multi-tenant operations, that are the problem, the paper argues.

The efficiency accomplishments of the big cloud providers "could lead to the perception that the problem is largely solved," said Pierre Delforge, the NRDC's director of high-tech sector energy efficiency, but that perception doesn't fit the reality of most data centers.

Data centers are "one of the few large industrial electricity uses which are growing," Delforge said, and they are a key factor in creating demand for new power plants in some regions.

Businesses that move to co-location, multi-tenant data center facilities don't necessarily make efficiency gains. Customers may be charged space-based pricing, paying by the rack or by square footage, with a limit on how much power they can use before additional charges kick in. This model offers little incentive to operate equipment as efficiently as possible.

In total, the report says U.S. data centers used 91 billion kilowatt-hours of electricity last year, "enough to power all of New York City's households twice over and growing." By 2020, annual data center energy consumption is expected to reach 140 billion kilowatt hours.

If companies used data center best practices, the report states, the economic benefits would be substantial. A 40% reduction in energy use, which the report says is only half of the technically possible reduction, would equal $3.8 billion in savings for businesses.
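As a rough sanity check on that figure (not a calculation from the report), the projected 2020 consumption multiplied by a 40% reduction and an assumed blended electricity price of about $0.07/kWh lands in the same ballpark:

```python
# Back-of-the-envelope check of the savings figure. The kWh projection is the
# article's; the price per kWh is an assumption, not a number from the NRDC.
projected_2020_kwh = 140e9        # ~140 billion kWh projected for 2020
reduction_fraction = 0.40         # the report's 40% efficiency scenario
assumed_price_per_kwh = 0.07      # USD, assumed blended commercial rate

savings = projected_2020_kwh * reduction_fraction * assumed_price_per_kwh
print(f"Estimated annual savings: ${savings / 1e9:.1f} billion")   # ~$3.9 billion
```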

The report also finds that energy efficiency progress is slowing. Once the obvious efficiency projects, such as isolating hot and cold aisles, are accomplished, additional investment in energy efficiency becomes harder to justify because of cost or a perception that it may increase risk. IT managers are "extremely cautious" about implementing aggressive energy management because it could introduce more risk to uptime, the report notes.

There are a number of measurements used to determine the efficiency of data centers, and the report recommends development of tools for determining CPU utilization, average server utilization, and average data center utilization. It says that "broad adoption of these simple utilization metrics across the data center industry would provide visibility on the IT efficiency of data centers, thereby creating market incentives for operators to optimize the utilization of their IT assets."

The NRDC isn't the first to look at this issue. In 2007, the U.S. Environmental Protection Agency, working with a broad range of data center operators and industry groups, released a report on data center power usage that found that the energy use of the nation's servers and data centers in 2006 "is estimated to be more than double the electricity that was consumed for this purpose in 2000." It called for energy efficiency improvements.

This story, "Data Centers are the New Polluters" was originally published by Computerworld .

Patrick Thibodeau — Reporter

Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld.

Link: http://www.cio.com

 

Data centers in Iceland? Yes, really! [1166]

by System Administrator - Friday, 27 March 2015, 10:39
 

Data centers in Iceland? Yes, really!

The country’s year-round cool climate and abundant supply of cheap, renewable energy make it a natural for data center siting -- or so Iceland’s boosters hope.

By Tracy Mayor

Companies in search of stable, inexpensive energy to power their data needs are looking beyond the borders of their own countries these days. Those willing to look really far might consider harboring their strategic assets in Iceland.

Or such is the hope of Landsvirkjun, the national power company of Iceland.

Iceland supporters made the case that the country's year-round cool climate and abundant variety of low-cost, renewable energy sources make it an attractive site for data centers.

"Our power generation in Iceland is predominantly based on hydropower, but we are increasingly building out into geothermal turbines and now wind farms as well," says  Bjorgvin Sigurdsson, EVP of business development at Landsvirkjun. "Because all of these options are renewable, Iceland is able to make long-term agreements at fixed prices. And we're not influenced by the changes in commodity markets -- oil or gas or coal -- which gives our clients great visibility into the future."

That value proposition made sense to Verne Global, a UK-based provider of offshore data centers that has partnered with Landsvirkjun to build a 45-acre state-of-the-art data center on a former NATO base just west of Reykjavik, an area protected from seismic activity.

"What we are allowed to do in Iceland is connect right into a power grid that's the second most resilient in the world, and we can do that a very predictable cost -- we've got [cost] protections out to 2030," says Tate Cantrell, Verne Global CTO. "We can give [customers] a great deal of predictability for what that infrastructure is going to cost in Iceland. You can't do that just anywhere the world."

 

One client -- BMW Group -- has been able to cut electricity costs by some 80% by moving its high-performance computing applications to the Iceland facility, Cantrell says, noting that not only is the cost of energy inexpensive and stable, but that labor costs are competitive as well.

Verne Global is not the only company, nor Iceland the only country, to promote the idea of siting data centers in northern climes. Google has repurposed a paper mill into a data center in Finland; Microsoft maintains a data center in Amsterdam; and companies like Advania offer colocation services in Sweden and Iceland.

That said, Cantrell is particularly partial to the unique benefits of Iceland's location. "It's in the mid-Atlantic right between two of the biggest financial centers [New York and London] and also between two of the biggest data center infrastructures on the planet," Cantrell points out. "People think about Iceland as remote, but in fact it's quite strategic."

This story, "Data centers in Iceland? Yes, really!" was originally published by  Computerworld.

Link: http://www.cio.com

Data science jobs not as plentiful as all the hype indicates [1222]

by System Administrator - Thursday, 7 May 2015, 20:21
 

Data science jobs not as plentiful as all the hype indicates

by Ed Burns

These days, it seems everyone is talking about data scientists. General interest in the role continues to grow, but that isn't leading to corresponding growth in jobs among employers.

Data scientist has famously been called "the sexiest job of the 21st century," and there can be no doubt that there's a lot of interest in data scientists and the work they do. That isn't surprising: There's something almost magical about being able to predict future business scenarios or find useful facts where others see only rows and rows of raw data. But despite all the hype, data science jobs aren't seeing especially high demand among employers.

The fact is, there's more call these days for data engineers, a less sexy position that typically involves identifying and implementing data analysis tools and working with database teams to ensure that data is prepared for analysis.

You couldn't tell that from Google: The number of searches on the term data scientist has shot up since 2012 and is continuing on a sharp upward trajectory (see Figure 1). By comparison, data engineer gets less than one-third as many searches.

 

Fig. 1

But checking job listings on LinkedIn returns nearly three times as many results for "data engineer" as for "data scientist." That isn't a new trend. Figures from job listing site Indeed.com show that since 2006, the percentage of listings for data engineers has held relatively steady at around 1.5% of the site's total job listings, with the exception of a brief upsurge followed by a corresponding downward correction around 2012 (see Figure 2).

Fig. 2

Job openings for data scientists are near their historic average today as well, but that's a much lower average, at barely above 0.15% of all listings. And the total number of data scientist positions listed is currently well below the most recent peak in 2013 (see Figure 3).

 

Fig. 3

Skilled data scientists in short supply

The disparity may be partly due to the fact that there are so few true data scientists available to hire. The mix of analytics, statistics, modeling, data visualization and communication skills that data scientists are expected to have makes them something of the proverbial unicorn. If businesses have realized that they aren't likely to find suitable candidates, it would make sense that they aren't bothering to post listings for data science jobs.

It could also be that companies just don't see that much value in hiring data scientists. They generally command large salaries due to their mix of skills. Hiring a data engineer to fix the info plumbing and a team of business analysts trained in self-service software like Tableau or QlikView to ask questions and get answers might make more sense economically.

Businesses are, after all, pragmatic. Glassdoor.com, another job listing site, estimates the national average salary for data scientists to be $118,709. Data engineers make $95,936 on average, while data analysts take home $62,379. Combined, a data engineer and a data analyst may cost more, but they're typically tasked with more general responsibilities that can have more concrete business value, and they should be able to get a lot more done than a single data scientist.
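The arithmetic behind that trade-off is easy to lay out with the Glassdoor averages quoted above:

```python
# Comparing one data scientist against a data engineer plus a data analyst,
# using the Glassdoor national averages cited in the article.
data_scientist = 118_709
data_engineer = 95_936
data_analyst = 62_379

combined = data_engineer + data_analyst
print(f"Data engineer + data analyst: ${combined:,}")                      # $158,315
print(f"Premium over one data scientist: ${combined - data_scientist:,}")  # $39,606
```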

Data science jobs not so necessary?

There's also the question of need. A lot of businesses don't have big, complex data-related questions answerable only by Ph.D.-level data scientists. Many organizations' data problems are much smaller in scale and can be managed in self-service tools that don't require advanced statistical analysis skills or programming knowledge.

None of this is meant to diminish what data scientists can add to an organization. For businesses that have valid needs, they can be game changers. But there has been so much hype about the role that corporate executives could be forgiven for thinking that hiring a data scientist is equivalent to employing a business magician. In many cases, the reality simply doesn't match the level of exuberance.

Data-savvy young people who are considering which way to take their skills may want to take note. If you want a sexy job, become a data scientist. But if you want more job opportunities, and perhaps more job security, becoming a data engineer might be a better career choice.

Ed Burns is site editor of SearchBusinessAnalytics. Email him at eburns@techtarget.com and follow him on Twitter: @EdBurnsTT.

Data virtualization tools move into strategic IT realm [1386]

by System Administrator - Thursday, 3 September 2015, 18:33
 

Data virtualization tools move into strategic IT realm

by Sue Troy

Data virtualization technology is moving into the strategic realm for CIOs. Find out why, and learn about the problems it's solving for companies.

Data virtualization tools have been around for years, but the technology is shifting in importance from tactical to strategic, as businesses look to integrate and access data from across Web sources, social media and the Internet of Things (IoT).

In this Q&A, data management expert Rick Sherman, founder of Athena IT Solutions, explains why data virtualization should be on the CIO radar, the benefits it brings over traditional data integration tools, how it can be used for competitive advantage and which industries the early adopters are in.

Editor's note: The following interview has been edited for clarity and length.

Why is data virtualization an important issue for CIOs?

Rick Sherman, founder, Athena IT Solutions

Rick Sherman: For CIOs, most companies -- small, medium and large -- are trying to access information from a lot of different sources both internally and externally. They might want to get information about their customers, share information with partners, buyers or whatever. In the old days they controlled the data, they could get to the data, they'd integrate the data. But now … data's coming from everywhere, not just from applications. Companies don't have the time or the money to physically integrate the data. Data virtualization is the ability to … extract the data from different sources and virtually be able to look at the data, analyze the data, combine the data with other sources. It's a perfect way in this fast-changing world to be able to get to data in real time … and analyze [and experiment with] the data. … it's perfect for what CIOs are confronted with: the onslaught of data they have.

It sounds like it's a need that didn't exist 10 years ago. Is it just in the past four or five years that data virtualization tools have become necessary, or have they been around longer than that?

Sherman: Well, like a lot of things in analytics, things have been around for a long time but the business need for them and the ability of the environment that we're in -- in terms of the amount of memory people have, network bandwidth -- [wasn't conducive to effective use of its capabilities]. The technology … has existed for a while but the business demand (social media, IoT, sensor devices, machine learning, Web data and a lot of the cloud data) [did not]. A lot of companies use cloud applications … so there's much more demand for this virtualization of the data and there's much more data that's scattered out there. Even though the technology existed before, the need for it has exploded and then the capabilities for that kind of technology to go after the volumes of data -- the unstructured data and structured data -- have all sort of grown based on the demand.

It's like a perfect storm where demand and capability have caught up to where it's a pretty practical and effective way to get the data.

What kind of advantages do data virtualization tools bring over traditional data integration tools or even application integration?

Sherman: Well, traditional data integration -- and I've been doing data integration for a long, long time -- is a great way … [to] get data into the state that you can analyze [it]. [These are] great technologies that have improved over the years. But it takes time to sort of find what that data is, figure how to cleanse data [and] put big data in one place. There is certainly still a need for data integration, application integration [and] data warehousing.

There [are some] use cases for data virtualization [instead of traditional data integration]. One is [if] it's a new source of data. You may need at some time later on to integrate the data but you want to get to the data now to analyze and look at it, see how useful it is, and you haven't gotten to the point where you can invest in getting it integrated. That's one use case scenario: the precursor of integrating it. There are plenty of other use cases where you never integrate the data with your source of data; you may not own the data. There's social media data, there's Web data, there's data that you might be exchanging between prospects, suppliers, partners and so on, that you may never own or have the ability or desire to integrate with your data. There's plenty of use cases for this data that's out there that you don't need to integrate. Both [scenarios] -- as a precursor to data integration or where you don't need to physically integrate the data but you need to analyze the data and bring it virtually into play -- are great use cases for data virtualization.

Is data virtualization technology something that can be used for competitive advantage or is it more sort of table stakes at this point for any business that's really serious about business intelligence?

Sherman: I think there are a lot of common uses of data virtualization at the tactical level. Applications like call centers use data virtualization all the time. That's tactical, that's just sort of keeping up with your peers. But, on a strategic level, data virtualization is just emerging as something that's more broadly used. To be able to get to the big data sources like social media, Web analytics, Internet of Things -- if you can use data virtualization to get that data fast and analyze it, that does give you a strategic or business edge as an advantage to it. It may not be the case five years from now, but at this point from a strategic point of view there is a competitive advantage to using it.

How would a call center use data virtualization?

Sherman: This has been a pretty common application for a while. [Say] you're in a call center in a financial services firm that has different lines of business. [It might] have bank accounts, business bank accounts, 401K, custodial accounts. Those are all in different applications, so the mortgage data might be in one location, credit cards might be in another, the kids' custodial account might be in another.

[If I call customer service] an account rep or customer support person [can] use virtualization to query across all those sources in real time and find out all the information related to me that they have. …

Some companies are [also integrating] social media data or Web analytics data. They have a customer prospect, they're providing some different services, they can see what products that customer orders. You might go to Amazon for an order; what product did you look at? If you're already a customer they might be looking at some campaign marketing tools [or] social media related to you. It could be related to what data they have on you as a customer or as a prospect and they could extend that to social media or Web analytics. It could extend to all types of data.
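Outside the interview, a toy example may help make the "query across sources without moving the data" idea concrete. The sketch below stands two local SQLite databases in for the separate mortgage and credit-card systems and assembles one customer view on the fly; real data virtualization tools do the same across heterogeneous, remote sources, with caching and query pushdown. All names and figures are invented.

```python
# Toy federated lookup: combine data from two separate databases at query
# time instead of physically integrating them. Purely illustrative.
import sqlite3

def setup(path, table, rows):
    conn = sqlite3.connect(path)
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} (customer_id TEXT, balance REAL)")
    conn.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    conn.commit()
    return conn

mortgages = setup(":memory:", "mortgages", [("C42", 250_000.0)])
cards = setup(":memory:", "cards", [("C42", 1_830.55)])

def customer_view(customer_id):
    # Each source is queried in place; only the combined answer is assembled here.
    return {
        "mortgage_balance": mortgages.execute(
            "SELECT balance FROM mortgages WHERE customer_id = ?", (customer_id,)).fetchone(),
        "card_balance": cards.execute(
            "SELECT balance FROM cards WHERE customer_id = ?", (customer_id,)).fetchone(),
    }

print(customer_view("C42"))
```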

Would the strategic use of it be pulling in new pieces of information that people have not, up until now, been using to help in a call center discussion?

Sherman: Yes, and even more strategically than that, so the call center is sort of the tactical, practical use of it. More strategically, people might be using data virtualization to pull in social media, different marketing campaigns. Most companies have one or more marketing tools out there. They might have their sales pipeline information. They might have information that their partners or suppliers are providing relating to you, if their business is to customers directly or to other businesses. There's a lot of different information that can come in and they can analyze how effectively their marketing campaigns are running, their sales campaigns are running.

It can be used in healthcare [to assess] their patient population health. There's a lot of different metrics related to how many times you visit your primary care physician, different specialists, information related to different tests that were run on you in the physician's office or in a hospital. All that information can be brought in using virtualization to analyze your patient population.

There are a lot of different applications to it, more than just the data they have on you now that's in some applications in house. It's the ability to bring all this varied data together from a lot of different sources quickly in real time.

Are there specific verticals that are well-suited for data virtualization over others or does it have broad appeal across industries?

Sherman: Well, it certainly does have broad appeal but … the more information-intensive and information-astute industries are the first ones to use it. Financial services and different industries that do a lot of marketing analytics are two of the profiles that are the early adopters and more advanced users of virtualization. I mentioned healthcare; that [will] be a little bit of a late adopter in that they have to worry about privacy and security and some other issues.

In between … as companies get more into predictive analytics [and] more into analyzing information outside of their on-premises data, the more applicable this becomes. So any company that's using a lot of big data sources is … a prime candidate for it. … As data explodes and as more and more information becomes available, the use of that data and the use of virtualization of that data expands.


Link: http://searchcio.techtarget.com

Data Visualization Techniques [796]

by System Administrator - Friday, 29 August 2014, 17:10
 

Data Visualization Techniques: From Basics to Big Data with SAS Visual Analytics

 From Basics to Big Data With SAS® Visual Analytics

A picture is worth a thousand words. That's why choosing an optimal way to present data analysis can have a dramatic impact on effective decision making. This paper examines the options for data visualization, how to overcome some of the unique challenges presented by big data, and how to quickly deliver results to your audience, even on mobile devices.

Please read the attached whitepaper

DATA-DRIVEN APPLICATIONS [592]

by System Administrator - Monday, 28 July 2014, 19:43
 

ARCHITECTS LEAD THE NEXT GENERATION OF DATA-DRIVEN APPLICATIONS

Rapid delivery of modern applications demands an open, software-defined, scale-out storage platform.

Today’s application architects face significant change on all fronts. Asked to deliver a new era of social, mobile, and big data applications, they must do so at a time when the entire application stack is shifting around them. To successfully deliver modern applications, application architects must identify and build upon technologies that support their demanding requirements, while also future-proofing the enterprise.

Please read the attached white paper

Database Threats [739]

by System Administrator - Wednesday, 13 August 2014, 00:04
 

Top Ten Database Threats

Learn How To Spot and Stop These Complex Threats

Corporate databases contain the crown jewels of an organization, which means a break-in, whether by insiders (employees) or outsiders (hacktivists, cybercriminals), can cost millions in fines, lawsuits, and customer attrition. The good news is that there are 10 commonly used methods to attack databases. Defend against these, and you will have a highly secure database.

Databases have the highest rate of breaches among all business assets, according to the 2012 Verizon Data Breach Report. Verizon reported that 96% of records breached are from databases, and the Open Security Foundation revealed that 242.6 million records were potentially compromised in 2012.

 

Please read the attached whitepaper.

------------------------------------------------------------------------------------------------------------

Securing your Database

 

Regardless of whether you are running your SQL database (Yes, I know... I should be calling them Relational Databases) on Windows, UNIX, or Linux, the threats and vulnerabilities are very similar.
 
There are many SQL database vendors on the market. The fact that they all need to function in a similar manner, based on the SQL standards, makes them vulnerable to similar types of attacks. These attacks can be initiated locally on the SQL server, or from as far away as a client's Web browser.
 
 
Below, I will outline some of the typical threats and misconfigurations that could endanger your SQL database. After each threat/misconfiguration, you will find some of the remediation strategies that can be used to reduce each risk.

Typical Threats

  • SQL injection - A SQL Injection is a hacking technique which attempts to pass unauthorized SQL commands through a web application with the intent of having them executed by the SQL server (see the parameterized-query sketch after this list).
    • require SQL be constructed using parameterized queries, preventing SQL injection attacks by differentiating code from data.
    • require Web Inputs be sanitized and validated - cast type and length.
    • install Intrusion Detection/Prevention software and configure it to log and alert!
    • do not provide SQL error or warning messages to the public
    • limit database privileges
  • Network eavesdropping - Network Eavesdropping consists of capturing data packets in transit on the network with the intent of capturing passwords, session information, or confidential data.
    • require all communications between Webserver and Database Server be SSL/TLS encrypted.
    • also require all communications between developers/administrators workstations and database server be SSL/TLS encrypted
  • Unauthorized server access - Unauthorized server access can be related to poorly configured or patched systems, or improper management of account privileges.
    • ensure that database server and all connecting web and app servers upstream are patched in a timely fashion.
    • ensure that all unused accounts, services, and features are disabled
    • employ "Least Privilege Principle"
    • use local firewall to restrict source systems
    • install and configure Intrusion Detection/Prevention on database/web/app servers.
    • LOG!!!  Review and Alert!  (I can't tell you how important custom logging/alerting is!)
  • Password cracking - Password Cracking is the exercise of attempting to recover passwords from a file or network stream using various methods such as a dictionary attack.
    • use complex passwords.
    • install and configure Intrusion Detection/Prevention on database/web/app servers.
    • LOG!!! Review and Alert! (Did I mention how important this is?)
  • Denial of Service attack - An extension of the SQL Injection, a SQL Denial of Service attack will use any search forms available on your website to get your SQL database to execute a number of long running and CPU intensive queries so that the site becomes unusable to other users.
    • install and configure Intrusion Detection/Prevention on database/web/app servers.
    • LOG!!! Review and Alert! (Ok... this is important! Really!)
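The first remediation above, parameterized queries, looks like this in practice. The sketch uses Python's standard sqlite3 module purely for illustration; the same placeholder pattern applies to any DB-API driver, and the table and input are made up.

```python
# Parameterized query: the driver binds user input strictly as data, so it can
# never be executed as SQL. Illustrative example with the stdlib sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"   # a classic injection attempt

# Vulnerable pattern (never do this): concatenation turns the input into SQL.
#   conn.execute("SELECT email FROM users WHERE name = '" + user_input + "'")

# Safe pattern: the ? placeholder keeps user_input as a plain value.
rows = conn.execute("SELECT email FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)   # [] -- the injection string matches no user
```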

Typical mis-configurations

  • Default, Blank & Weak Username/Password - As in any computer system, database password controls are critical in setting and maintaining the security of the database. Default passwords must be changed upon installation. Corporate Password Standards should be maintained for all accounts.
  • Excessive User & Group Privilege - Employ both the Principle of Least Privilege and Segregation of Duties between Administrative, Application, and User accounts.
  • Unpatched Databases/Operating Systems - Database software, like the Operating System, is an integral part of your overall system security. Failing to patch in a timely fashion based on the vendor's recommendations places you in a state of higher risk.
  • Unnecessary Enabled Database/Operating System Features - Any user/service account, feature, or service installed on a system that is enabled but not used is a potential vector for attack. If you do not need an account, feature, or service in your production database, turn it off or, better yet, remove it.
  • Unencrypted sensitive data – at rest and in motion - Today's servers have adequate resources to negate any argument around encryption/decryption performance. Any sensitive data that resides within the database should be encrypted. I'm not saying encrypt entire tables or databases, unless required by compliance, but sensitive fields (Social Insurance Numbers, financial information, etc.) should be encrypted; see the field-encryption sketch after this list. Many SQL vendors now also provide Transparent Data Encryption for full tables or databases. Also ensure that any communications to and from the SQL server use SSL/TLS encryption to prevent eavesdropping.
  • Failure to configure audit and transaction logging - Properly configured, audit and transaction logs will allow you to identify abnormal or malicious traffic to your server. Coupled with Intrusion Detection/Prevention software, they will allow you to alert on and/or remediate the intrusion immediately.
  • Location of SQL Database server/data - During installation, ensure that the DB software and databases are on a separate drive/volume from the operating system. This allows you to provide greater protection to the storage volume without adversely impacting the underlying OS.
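As a sketch of the field-level encryption recommended above, the snippet below encrypts a single sensitive value before it would be written to a column. It assumes the third-party cryptography package (pip install cryptography); key management is deliberately out of scope, and in practice the key would live in a vault or HSM, never beside the data.

```python
# Encrypt a sensitive field before storing it. Requires the third-party
# "cryptography" package; the value and key handling here are illustrative.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store securely, separate from the database
fernet = Fernet(key)

social_insurance_number = "123-456-789"
ciphertext = fernet.encrypt(social_insurance_number.encode())

# ciphertext (not the plaintext) is what gets written to the database column
print(ciphertext)
print(fernet.decrypt(ciphertext).decode())   # back to "123-456-789" when authorized
```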

 


And finally:  On UNIX Systems..... USE A CHROOT JAIL! 

A chroot provides a sandboxed environment in which to install a critical service, one that "looks like" the host operating system. It has a file structure similar to the host's, with copies of all the files and libraries required to run the application, but is logically segregated from the operating system it is hosted on. A properly configured chroot environment will prevent privilege escalation from the hosted application to the hosting operating system. What happens in chroot, stays in chroot!
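A minimal sketch of the idea, assuming a Unix host, root privileges and a jail directory that already contains everything the service needs (all paths and numeric IDs below are placeholders):

```python
# Enter a chroot jail, drop privileges, then exec the service inside it.
# Unix only; must start as root. Paths and numeric IDs are placeholders.
import os

JAIL = "/srv/jail/mydb"

os.chroot(JAIL)    # everything under JAIL now appears as "/"
os.chdir("/")      # make sure the working directory is inside the jail
os.setgid(999)     # drop to an unprivileged group ...
os.setuid(999)     # ... and user once the chroot is in place

# Replace this process with the jailed service (path relative to the new root).
os.execv("/bin/dbserver", ["dbserver"])
```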

Reference Material:

Default SA accounts: (these need to be disabled or renamed once installation is complete)

Database      Admin Account Name    Admin Database
MSSQL         sa                    master
Oracle        sys, system           dba_users
DB2           dasusr1
MySQL         root                  mysql
PostgreSQL    postgres              pg_shadow
Sybase        sa, sccadmin          master


Default TCP ports: (firewalls should restrict access on these ports to only the servers/workstations authorized to communicate with the DB server)

Database      Port Used
MSSQL         TCP 1433
Oracle        TCP 1521
DB2           TCP 50000
MySQL         TCP 3306
PostgreSQL    TCP 5432
Sybase        TCP 5000
 
