Referencias | References


Complete references for the vocabulary, events, chronicles, evidence, and other content used in KW Foundation projects related to biotechnology and neuroscience.


S

Salsipuedes [1696]

by System Administrator - Saturday, 17 September 2016, 12:37
 

Charrúas: The "Salsipuedes" Massacre

Documentary:

San Francisco launches network just for "Internet of Things" devices [1535]

by System Administrator - Thursday, 29 October 2015, 14:44
 

San Francisco launches network just for "Internet of Things" devices

By Chris Nehls

San Francisco this week became the first city in the United States to have a wireless infrastructure solely for use by Internet-connected devices.

The City by the Bay will sponsor a one-year pilot program with the French company Sigfox to provide a network for devices that are part of the "Internet of Things." Sigfox has installed about 20 suitcase-sized antennae on libraries and other city-owned buildings that relay the small amounts of data that devices like smart parking meters and water sensors transmit.

Roughly 20 antennae were enough to cover all of San Francisco because the antennae operate at different frequencies and at lower power than existing Wi-Fi or cell phone service. Having a parallel network just for IoT devices also keeps them from sapping other wireless signal strength.
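Networks of this kind carry only tiny payloads, which is why low-rate devices such as parking meters fit: a reading is a handful of bytes, not a stream. As a minimal sketch (the field layout and names here are hypothetical illustrations, not Sigfox's actual frame format), a sensor reading can be packed into a compact binary frame:

```python
import struct

def pack_reading(meter_id, occupied, battery_mv):
    """Pack a parking-meter reading into a 7-byte frame:
    u32 meter id, u8 occupancy flag, u16 battery millivolts."""
    return struct.pack("<IBH", meter_id, 1 if occupied else 0, battery_mv)

frame = pack_reading(4021, True, 3300)
print(len(frame), frame.hex())  # 7 bytes on the wire
```

The `<` prefix requests little-endian standard sizes with no padding, so the whole reading travels in 7 bytes; the receiving side recovers the fields with the matching `struct.unpack("<IBH", frame)`.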

The year-long pilot, run at no cost to the city, will allow local startups and developers to build new devices and applications using the network infrastructure. Sigfox will host a weekend hackathon for developers in November to encourage local tech mavens to get started.

San Francisco is the first of 10 cities in which Sigfox plans to provide coverage for connected devices by the first quarter of 2016. As Forbes reported, the company will charge about a dollar a year per device connected to its network.

For more:
- read the Forbes story
- read the San Francisco Chronicle's coverage
- read this San Francisco Business Times story

Link: http://go.questexweb.com

San Vicente Pallotti [156]

by System Administrator - Tuesday, 7 January 2014, 23:46
 

 Pallotti

San Vicente Pallotti (Italian: Vincenzo Pallotti), 1795-1850, was an Italian priest and the founder of the Catholic Apostolate, the congregation known as the Pallottine Fathers, or simply the Pallottines. He is regarded as the forerunner of worldwide Catholic Action.

Source: http://es.wikipedia.org/wiki/Vicente_Pallotti

Sandboxing [476]

by System Administrator - Tuesday, 24 June 2014, 13:13
 

Dynamic Sandboxing

Malware has advanced to the point where it can determine whether it is running inside the sandbox of a computer security tool.

Sandbox

In software security, one of the most frequently discussed concepts is the vulnerability: a security flaw that, under certain circumstances, can be exploited by third parties to execute malicious code (malware) on a user's machine. Besides keeping our applications up to date and applying security patches as they are published, for some time now we have been hearing about sandboxing, a security mechanism used in applications as well known as Adobe Reader X and Google Chrome.

So what is sandboxing? The "sandbox" is a form of process isolation: a mechanism that applications implement in order to run programs safely, "isolating" them from the rest of the system inside a kind of virtual container from which the resources the application requests (memory, disk space, required privileges, and so on) can be controlled. This tight control over the process helps determine whether the code being executed is malicious, since, as a general rule, any access to input devices or inspection of the host system is restricted.
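As a toy illustration of the isolation idea (and only that: restricting `exec` in CPython is famously escapable, so this is a sketch of the concept, not a secure sandbox; Chrome and Adobe Reader X isolate at the OS-process level), untrusted code can be run against a whitelisted namespace:

```python
import builtins

def run_sandboxed(source, allowed_builtins):
    """Execute untrusted source with only a whitelist of builtins visible."""
    safe = {name: getattr(builtins, name) for name in allowed_builtins}
    env = {"__builtins__": safe}
    exec(source, env)  # the code sees only `safe` plus whatever it defines itself
    return env

ns = run_sandboxed("x = abs(-42)", ["abs"])
print(ns["x"])  # 42

try:
    run_sandboxed("open('/etc/passwd')", ["abs"])  # file access is not whitelisted
except NameError as exc:
    print("blocked:", exc)
```

Any name outside the whitelist, such as `open`, simply does not exist inside the container and raises `NameError`, mirroring how a real sandbox denies resource requests the policy does not allow.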

Thanks to sandboxing, Google Chrome, for example, is able to "isolate" browser tabs from one another and prevent a web page with malicious content from installing software on our system, monitoring what we are doing, or accessing information stored on our hard drive (Flash is among the applications it isolates). This mechanism was also added to Adobe Reader X, because one of Adobe's biggest weak points was the ability to hide code inside PDF files that would execute when a "malicious document" was opened: the application neither prevented nor controlled requests that went beyond displaying the file's contents (there were cases of changes to the Windows registry and software being installed on the system).

Besides protecting users, this technique is also used by security teams, for example to study malware in a controlled environment and observe its effects on a system in order to characterize it. We can also use applications to "isolate" others and test them safely (important if we are suspicious of them, although, when in doubt, it is better not to install them at all; using Sandboxie or Glipse can make our tests safer). On a larger scale, virtualizing one operating system inside another is also a practical form of sandboxing.

Source: http://bitelia.com/2012/08/que-es-el-sandboxing

Blue Coat conducted research concluding that malware has advanced to the point where it can determine whether it is running inside the sandbox of a security tool. It therefore concludes that the next step in security is dynamic sandboxing, with the capability to replicate real environments.
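The environment checks evasive malware performs before detonating can be mundane. A minimal sketch of the idea, with purely hypothetical markers (real samples probe VM artifacts, timing behavior, and hardware details); dynamic sandboxing counters exactly this class of check by replicating realistic production environments:

```python
import os

def sandbox_signals(env=None):
    """Toy checks of the kind evasive malware runs before acting.
    The environment markers below are illustrative assumptions, not a real list."""
    env = os.environ if env is None else env
    signals = []
    if (os.cpu_count() or 1) < 2:
        signals.append("single vCPU")            # analysis VMs are often minimal
    for marker in ("SANDBOX", "ANALYSIS_RUN"):   # hypothetical marker variables
        if marker in env:
            signals.append(f"env marker: {marker}")
    return signals

print(sandbox_signals({"SANDBOX": "1"}))
```

A sample that finds such signals can stay dormant, which is why a detection environment that looks exactly like production (point 2 in the list below, when applied together) matters.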

Blue Coat, a security company specializing in highly complex business environments, presented a short list of the key features a Dynamic Sandboxing solution should offer, to consider when making a decision:

1. Multiple detection methodologies: both emulation and virtualization enable detection of virtual-machine-aware malware.

2. Realistic, customizable virtual environments: these can faithfully replicate real production environments in order to counter today's most sophisticated attacks.

3. Behavior-based classification and customizable risk scoring, which should indicate why a sample file or URL was flagged as malicious rather than simply reporting a "good or bad" verdict. The patterns should also provide a risk rating and cover the full range of possibilities, from generic malicious behavior to family-specific behavior patterns (for example, banking trojans).

4. Access to comprehensive analysis resources and event data, so that the malware analysis solution is available to your security team.

5. Actionable, shared threat intelligence. Attack data should be fed back "upstream" into the global intelligence network so that future attacks can be blocked at the source, thereby countering polymorphic attacks.

Dynamic sandboxing technology underpins the technical feasibility of a solid, comprehensive defense against the sophisticated threats in today's landscape. While it is an important step toward a stronger overall information security strategy, a multi-layered defense must be able to block known advanced and persistent threats, proactively detect unknown and already-present malware, and automate post-intrusion incident containment and resolution.

Source: IT-Sitio / 24.06.2014

Sanguinetti Coirolo, Julio María

by System Administrator - Wednesday, 8 January 2014, 17:31
 

 Julio María Sanguinetti

Julio María Sanguinetti Coirolo (Montevideo, 6 January 1936) is a Uruguayan lawyer, historian, journalist, and politician. He has held various government posts: he was a deputy three times, a minister twice, a senator, and President of the Republic for two terms, 1985-1990 and 1995-2000. He is a member of the "Foro Batllista" faction of the Partido Colorado, of which he was Secretary General from 1983 to 1985 and from 2004 to 2009.

Source: http://es.wikipedia.org/wiki/Julio_María_Sanguinetti

Say the Big, Bad ‘B’ Word: Bitcoin and the Internet of Money [1580]

by System Administrator - Sunday, 15 November 2015, 22:54
 

Say the Big, Bad ‘B’ Word: Bitcoin and the Internet of Money

By Jeremy Allaire, Co-Founder and CEO, Circle

It has been a bit amusing watching as intellectuals and “thought leaders,” financial industry executives, technologists and even the media itself have fallen in love with “the blockchain,” “distributed ledgers” and other phrases to talk about the innovation happening in how trust and value exchange work on the Internet.

Notably absent from this high-minded and purportedly insightful thinking is reference to bitcoin, the actual open platform that distributes trust and provides a highly secure ledger to exchange value around the world.

If bitcoin is referenced, it’s dismissively, as if smart people are in the know that bitcoin (the digital asset) isn’t necessary or important to fueling this global network of distributed and decentralized trust. For the most part, it’s just a cop-out and intellectual laziness; I have yet to meet anyone who shares these thoughts who actually has any idea how any of the technology actually works.

When pressed, proponents of “permission blockchains” espouse the need for control, for building multilateral consensus-based settlement networks; they speak of the incredible savings in IT expense and back offices of financial institutions. These are real use cases, and bitcoin’s blockchain can’t solve all use cases, just as HTTP doesn’t solve every form of data and information transfer on the Internet. But for the most part, what people are saying is that “they want all the benefits of bitcoin, without bitcoin.”

Imagine back 20 years, when the core protocols of the Internet were getting going (HTTP/HTML, SMTP, etc.), and venture capital began to pour into software and Internet companies, while the leading media, telecom and communications companies plotted to create their own private, gated online services. Remember, the Internet was unreliable, insecure, and filled with creeps and hackers. People wanted safe, secure, trusted and proprietary networks. That was the future. Throughout the world, national monopolies, government-run telecoms companies and large commercial industry players drafted up strategies for building closed and proprietary systems to deliver “the information superhighway” to consumers.

All of these companies were interested in the “technology behind the Internet,” but not in the Internet itself. They sought to build “permission Internets,” where they could control what went online, who could publish and who could access the information. And they’d take a toll, as they always had, for providing these commodity information utilities.

We all know what happened. Smart creators and engineers from all around the world got inspired by the open Internet, and rather than waiting around, started coding and building on these limited, but conceptually transformative, protocols. More and more people got connected to the network. Permissionless innovation took hold, and we changed the world.

There are so many lessons for us here, and when we think about where we should apply our energy and focus, what ideas we should support and build upon, we need to have intellectual honesty and not fall into the trappings of established power and institutional biases.

Why are we here, working on this project? What’s important about the work, and how does that relate to the technology choices and investments that we make?

It’s an incredibly unique moment in human history — in the past 10 years, we’ve connected almost every human on the planet, we’ve radically expanded access to knowledge and information, we’ve made it possible for commerce and communications to flow in ways that were unimaginable just decades ago.

What was it about the open Internet that helped all of this to grow and flourish? What lessons can we draw from that when we think about the prospects for bitcoin and blockchain technology?

The DNA of the Internet

I like to think of the Internet as having a set of embedded ideals, a kind of intellectual and ultimately technical DNA.

 

The Internet uses a distributed and decentralized architecture, originally conceived by the Defense Department as a way to provide greater security and no single point of failure in the network. With its commercialization, the Internet is also a permissionless network. Anyone can join by putting a computer on the network that speaks sufficient protocols. You don’t need permission to run a mail server or Web server, you just plug in.

The Internet is not truly governed by anyone — no government controls the Internet; no private corporation controls the Internet. If anything, the Internet is governed by open and free intellectual property, by open standard protocols that technical contributors around the world help to shape year in and year out, through standards organizations, hacking and open source software. All of this intellectual property is free for the world, a gift to the global commons. It’s also now the very fabric of our global society and economy.

Encoded in this DNA is also a very deep democratic and liberal ethos focused on freedom of expression, personal privacy and creative freedom; despite attempts to limit this, that ethos still breathes heavily even through the darkest and most controlled parts of the Internet.

Bitcoin itself is an outgrowth of this same DNA: A distributed and decentralized network, based on open standard protocols, open source software, that is permissionless — anyone can join and use the network. But instead of providing protocols for sharing information and communications, bitcoin and the blockchain network are for sharing value in a trustless fashion.

In order to create a transaction settlement network without a central intermediary, there needs to be a way to secure the network from abuse, especially if the network’s purpose is to record ownership in an irrefutable manner. With bitcoin’s protocols, newly created ledger entries themselves act as a token which market participants can earn by conducting work on behalf of the network. And, once issued, those tokens are what allow value to move on the ledger. This ingenious model, leveraging proof of work, has demonstrated tremendous resilience. To date, no one has devised a better scheme to build a global trust and transaction ledger.
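The proof-of-work scheme described above can be sketched in a few lines: finding a valid nonce is expensive, while verifying one takes a single hash. This is a toy with an arbitrary difficulty target, not Bitcoin's actual block format or difficulty rules:

```python
import hashlib

def mine(data, difficulty):
    """Search for a nonce whose SHA-256 digest of data+nonce
    starts with `difficulty` leading hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("ledger entry: Alice pays Bob", 4)
print(nonce, digest)
```

Finding the nonce took on the order of 16^4 hash attempts; any participant can check the result by recomputing one hash. That asymmetry is what lets the network reward work without a central intermediary.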

Do we really think that financial industry consortia and alt-chain startups can challenge the open Internet? Does anyone really think these “permissioned blockchains” have an iota of a chance at accomplishing anything more than building buggy, insecure closed back-end IT systems for banks?

Of course, many people are now starting to realize that this $5 billion asset and secure ledger are quite powerful, and that many other layers of applications can be built on this infrastructure. With greater extensibility and programmability, bitcoin can evolve to enable transformations in how all forms of property are secured and exchanged, how voting and governance function, including spilling into the automation of commercial law, audit and accounting. Essentially, just as the early Internet disrupted and transformed media and communications, this wave will disrupt the “trust and assurance” industries, which includes government, law, accounting, insurance and, last but not least, finance.

But it’s not a currency

On one point, I think some of the “blockchain is good, bitcoin is bad” narrative is reasonably accurate. Bitcoin the token is not a government-issued currency. At this stage in its evolution, I don’t think it’s helpful to think about bitcoin as a currency, but rather as a new kind of asset — a “digital asset” — that provides a valued token necessary to exchange value in a secure manner globally. It’s the fuel that powers the blockchain network, and it’s increasingly important that there be robust liquid marketplaces globally to access and use this powerful digital asset, as it can be used to secure and exchange nearly anything of value, including fiat currencies.

So what about a hybrid digital economic model? Most people don't want a new currency; they are quite happy with their dollars, euros, pounds, yuan and yen. They get paid in these currencies, they pay their taxes in these currencies, they understand their purchasing power in these currencies, and so on.

But, pretty uniformly, people want the benefits of bitcoin and the blockchain — near-instant transfers, globally available on any Internet-connected device, highly secure and nearly free value transfers.

In such a model, consumers and businesses rely on whatever currency is important in their lives, but can share value globally and instantly for free, by using the bitcoin blockchain as an open and secure ledger and settlement network. For the duration of a transfer, local currencies are converted into and out of “digital tokens,” and the transaction is settled over the Internet.
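That conversion flow is just arithmetic. A sketch, with purely hypothetical rates and function names (a real settlement would also face spreads, fees, and price movement during the transfer):

```python
def settle(amount, from_ccy, to_ccy, btc_price):
    """Hybrid-model sketch: local currency -> digital tokens -> local currency.
    btc_price maps a currency code to the (hypothetical) price of 1 BTC in it."""
    tokens = amount / btc_price[from_ccy]   # convert in: buy tokens with local currency
    return tokens * btc_price[to_ccy]       # settle out: sell tokens at the destination

rates = {"USD": 400.0, "EUR": 360.0}        # illustrative rates, not market data
print(settle(100.0, "USD", "EUR", rates))   # 90.0
```

Neither party ever has to hold the tokens as an investment; bitcoin serves only as the settlement rail for the duration of the transfer.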

Soon, we’ll be able to share value globally, and soon thereafter, we’ll start to layer in other forms of value — property, securities, insurance — and then later we’ll build in smart rules and business logic that can run on this global secure network in a trustworthy manner, and we’ll all be part of constructing new rules of global commercial and legal governance.

Everyone, it’s okay to say the word “bitcoin” and acknowledge that it is the actual platform that is driving this innovation that we’re all building on. It’s also okay to say “the bitcoin blockchain,” or “the blockchain,” if you’re afraid that people will think you’re weird. But until someone actually builds, ships and scales a global open platform, open like the rest of the Internet, that can do what the bitcoin blockchain can today, there’s really nothing else to talk about.

Jeremy Allaire is an Internet entrepreneur who has spent the past 20 years building and leading global technology companies with products used by hundreds of millions of consumers and millions of businesses worldwide. Today, Allaire is co-founder and CEO of Circle, a consumer Internet company focused on transforming the world economy with secure, simple and less costly technology for storing and using money. Reach him @jerallaire.

Link: http://recode.net

Scaling SaaS Delivery for Long-Term Success [1030]

by System Administrator - Wednesday, 24 December 2014, 01:50
 

Scaling SaaS Delivery for Long-Term Success

Critical Cost and Performance Considerations to Ensure Sustainable & Profitable Service Delivery

Adoption of Software-as-a-Service (SaaS) solutions is accelerating. Every software industry survey and market forecast says so. This is because SaaS applications are almost always more flexible, versatile, and cost-effective than traditional on-premises solutions.

This trend represents a great opportunity for established SaaS companies to expand their customer base. But it also presents challenges for SaaS providers seeking to gain a competitive advantage and to build sustainable businesses over the long haul. Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) resources make it easier and more economical for companies to launch SaaS businesses. They eliminate many service delivery and software development barriers to entry. This, in turn, has opened the door to a proliferation of players vying for market share.

This ‘Cloud Rush’ effect is commoditizing many segments of the SaaS marketplace. Competitors are often willing to undercut each other on price to win over customers. At the same time, SaaS has made it much easier for customers to switch solutions if they experience availability or performance issues. Savvy SaaS executives recognize that their service delivery requirements increase as their business grows and the demands of their customers expand. Many leading SaaS companies leverage a combination of public and private cloud resources to meet the varying needs of their growing customer base.

THINKstrategies believes it is critical for SaaS companies to team with leading data center providers that offer multi-cloud capabilities to meet their mission-critical service delivery needs. In particular, they should partner with providers that offer a variety of public cloud and hosted service alternatives on a global scale to ensure the availability and performance of their SaaS solutions. This whitepaper discusses these imperatives and provides THINKstrategies' recommendations for SaaS executives looking for the best service delivery alternatives to capitalize on today's market opportunities.

 

Please read the attached whitepaper

Science & Religion: A Centuries-old War Rages On [1235]

by System Administrator - Wednesday, 20 May 2015, 21:05
 

 

Science & Religion: A Centuries-old War Rages On

While some in the scientific and religious communities have declared an end to the tensions between faith and fact, the conflict continues to have impacts on health, politics, and the environment.

By Jerry A. Coyne

The battle between science and religion is regularly declared over, with both sides having reached an amicable truce. “Accommodationists” on both the religious and scientific sides assure us that there is no conflict between these areas, that they deal with separate spheres of inquiry (science deals with the natural world, religion with meaning, morals and values), or even that they can somehow help each other via an unspecified “dialogue.” After all, we’re told, there are many religious scientists (two notables in my field are Francis Collins, director of the National Institutes of Health and evangelical Christian, and Kenneth Miller, an observant Catholic who is also a biologist at Brown University), so how can there possibly be a conflict?

But despite these claims, the dust hasn’t settled. Why the continuing publication of accommodationist books if the issue was resolved long ago? Why do 55 percent of Americans aver that “science and religion are often in conflict”? Why are less than 10 percent of all Americans agnostics or atheists, yet that proportion rises to 62 percent of all scientists at “elite” universities, and to 93 percent among members of the National Academy of Sciences? In a poll taken in 2006, 64 percent of Americans claimed that if science contradicted one of the tenets of their faith, they’d reject the science in favor of their faith. Clearly, there is still friction between science and religion, even if some scientists can leave their faith at the laboratory door.

In fact, the conflict between science and religion—at least the Abrahamic faiths dominant in the U.S.—is deep, endemic, and unlikely to be resolved. For this conflict is one between faith and fact—a battle in the long-fought war between rationality and superstition.

Why is there such concern about conflict between religion and science, as opposed to between, say, sports and science, or business and science? It’s because science and religion are both in the business of determining what is true in the universe—although religion has other concerns as well. Science’s ambit is well known, but it’s also important to realize that religion also depends heavily on claims about what is true: claims about the existence and nature of gods, how one’s god wants you to behave, the occurrence of miracles, and whether there are eternal souls, untrammeled free will, and afterlives.

This fact-dependence of faith is recognized by most theologians. As renowned religious scholar Ian Barbour noted, “A religious tradition is indeed a way of life and not a set of abstract ideas. But a way of life presupposes beliefs about the nature of reality and cannot be sustained if those beliefs are no longer credible.” Nearly every faith has some non-negotiable beliefs about reality. The foundational claim of Christianity, for instance, is that Jesus was a divine savior whose acceptance gains us eternal life. Factual belief is pervasive: According to a 2013 Harris poll, 64 percent of all Americans believe in the survival of the soul after death, 57 percent that Jesus was born of a virgin, and 58 percent in the existence of Satan and hell.

But while science and religion both claim to discern what’s true, only science has a system for weeding out what’s false. In the end, that is the irreconcilable conflict between them. Science is not just a profession or a body of facts, but, more important, a set of cognitive and practical tools designed to understand brute reality while overcoming the human desire to believe what we like or what we find emotionally satisfying. The tools are many, including observation of nature, peer review and replication of results, and above all, the hegemony of doubt and criticality. The best characterization of science I know came from physicist Richard Feynman: “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.”

In contrast, religion has no way to adjudicate its truth claims, for those claims rest on ancient scripture, revelation, dogma, and above all, faith: belief without sufficient evidence. Is there one God, or many? Does he want us to work on the Sabbath? Is there an afterlife? Was Jesus the son of God? The problem, of course, is that faith is no way to decide what’s true. It is, à la Feynman, an institutionalized way of fooling yourself. Religion acts like science in making claims about reality, but then morphs into pseudoscience in the way it rejects disconfirming evidence and insulates its claims against testing. The toolkit of science is—and will remain—the only way to discover what’s real, whether in biology, physics, history, or archaeology. Religion can offer communality and can buttress morality, but has no purchase on truth.

But even if science and religion are incompatible, what’s the harm? Most of the damage comes from something inherent in many faiths: proselytizing. If you have a faith-based code of conduct attached to beliefs in absolute truths and eternal rewards and punishments, you’re tempted to impose those truths on others. The most obvious subjects are children, who are usually indoctrinated with their parents’ brand of faith. That can produce not just psychological but physical harm: 43 of 50 U.S. states, for instance, have laws exempting parents from prosecution if they harm their sick children by rejecting science-based medicine in favor of faith healing. Forty-eight of our 50 states allow religious exemptions from vaccination. The results are predictable: children needlessly become sick, and some die. And we’re all complicit in those laws, which are based on unquestioning respect for faith.

There is also “horizontal” proselytizing: pressing faith-based beliefs on others via politics. This has led to religion-based opposition to things like global warming, condom use for preventing AIDS, and abortion. It’s unlikely that any of this would exist if people were to privilege reason over faith.

In the end, in both science and everyday life, it’s always good policy to hold your beliefs with a tenacity proportional to the evidence supporting them. That is the foundation of science and the opposite of religion. As the philosopher Walter Kaufmann noted, “Belief without evidence is not a virtue, but opens the floodgates to every form of superstition, prejudice, and madness.”

Jerry A. Coyne is a professor of ecology and evolution at the University of Chicago. His 2009 bestseller, Why Evolution is True, was one of Newsweek’s “50 Books for Our Time.” Faith vs. Fact: Why Science and Religion are Incompatible goes on sale today.

 

Scientific Linux [754]

by System Administrator - Friday, 15 August 2014, 20:15
 

Scientific Linux, the operating system of the Hadron Collider

By Marcos Merino

The world's two most powerful hadron colliders are at CERN (Switzerland) and Fermilab (USA). Both leading laboratories have long used Linux to run their research computing systems. Fermilab started by building Fermi Linux (based on Red Hat Enterprise Linux) in 1998. When, six years later, CERN began migrating from RHEL to its own distribution, the two laboratories got in touch and joined forces to develop a single RHEL-based Linux distribution, which took the name Scientific Linux.

Scientific Linux is a distribution very similar to the successful CentOS: both are RHEL clones that use neither its artwork nor its brand but recompile its code from scratch (thus maintaining binary-level compatibility, although SL introduces some customizations in its configuration, plus extra software), and they are, together with RHEL itself, possibly the three most robust and reliable 'flavors' of Linux for servers and other production environments. At the LHC, Scientific Linux runs (whether virtualized or natively) on the thousands of CPUs (PC and Mac) that process the 10-15 petabytes of data the great collider produces each year, so this operating system played a leading role in confirming the existence of the Higgs boson (popularly known as the 'God particle').

However, we should not make the mistake of thinking that Scientific Linux is a distribution "only for scientists": not only can any of us download it and use it on servers or, say, a personal laptop, it does not even ship with an exhaustive pack of scientific software (after all, there would be as many possible packs as scientific disciplines). No: the goal of SL is to give laboratories and research groups a common software base, to avoid duplicating effort on certain developments and to guarantee support for several years (for example, the latest version of Scientific Linux, released in 2012, will receive updates until 2020).

The joining of forces between Red Hat and CentOS at the beginning of 2014, however, could lead Scientific Linux to rethink its approach. Its website explains that, after talks between the SL developer group and the people behind CentOS and Red Hat, as well as CERN and Fermilab, the commitment to Scientific Linux's original goal remains ("to provide a stable, open-source, well-supported platform to meet the needs of high-energy physics experiments"), although "users outside" this field will be taken into account in future decisions, which could include joining a CentOS "special interest group" and developing the next version of SL as a CentOS 7 derivative aimed specifically at the scientific community.

Marcos Merino

Marcos Merino is a freelance writer and 2.0 marketing consultant. Self-taught, with experience in media (print and radio), he is responsible for online communications at nonprofit organizations.

Scouting [236]

by System Administrator - Friday, 10 January 2014, 23:15
 

In business management, the word "scouting" refers to the task of searching for and recruiting human resources. It also includes searching and selecting from other useful sources, such as ideas.

