Turbo Code Applications: A Journey from a Paper to Realization
This session will show you how to build a Docker-based solution that incorporates async messaging, CQRS patterns, fit-for-purpose data back ends, and eventual consistency as part of a microservices solution.
In addition, the session will address specific challenges such as dealing with message order, versioning, poison messages, and data protection. Speaker: Michele Bustamante, Solliance.

Practical Istio
A lot of ink has been spilled describing what Istio is and the long list of features it provides. In this talk, we move past the overview and dive into specific problems that companies are solving using parts of Istio today. We'll walk through several concrete use cases, including: how one company made an existing system highly available (HA) with Istio; how another company with strict security requirements uses Citadel to manage certificates across their fleet and secure service-to-service communication; and finally how a third company is using Envoy and Pilot to manage traffic in their deployments, enabling them to break their monolith down into a set of microservices without affecting the other teams in their organization that consume the monolith's functionality.
To make things concrete, we'll walk through demos of each use case in a live cluster, showing how Istio can be deployed and integrated with applications to meet the requirements of these use cases. Speaker: Zack Butcher, Tetrate.
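The message-ordering challenge called out in the microservices session above can be sketched with a small, hypothetical consumer (names and structure are assumptions, not code from the session) that buffers out-of-order messages and delivers them in sequence, dropping duplicates for idempotency:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class OrderedConsumer:
    """Delivers messages to `handler` exactly once, in sequence order,
    even when they arrive out of order or are redelivered."""
    handler: Callable[[str], None]
    next_seq: int = 0                                  # next sequence number to deliver
    pending: Dict[int, str] = field(default_factory=dict)  # buffered out-of-order messages

    def receive(self, seq: int, payload: str) -> None:
        if seq < self.next_seq:
            return  # duplicate: already delivered, drop it
        self.pending[seq] = payload
        # Deliver any contiguous run starting at next_seq.
        while self.next_seq in self.pending:
            self.handler(self.pending.pop(self.next_seq))
            self.next_seq += 1
```

For example, if messages arrive with sequence numbers 0, 2, 1, the handler still sees them as 0, 1, 2; a redelivered message 1 is silently ignored.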
Since eBPF was introduced into the Linux kernel, there has been an explosion of capabilities and tools built on it, so you've probably heard the term. But do you know what it is and how to use it? In this demo-rich talk we'll explore some of the powerful things we can do with this technology, especially in the context of containers. Speaker: Liz Rice, Aqua Security.
But if that stack runs your deployments, what deploys it? See how we use Docker to make that happen, and see if it can help you run your own internal CI stack, or spark some other uses for the pattern!
Services on Demand
Data Tells Us You Will Love This Talk
The Data Team at Docker builds and maintains pipelines for all functional units within the company while dogfooding the latest versions of our products. Join us as we dive into the data engineering infrastructure that we use to deliver metrics and insights to different audiences.
We will also share some of the key learnings and best practices from our effort to create a data-centric culture within the company. If you are curious about techniques to increase your processing power with GPUs and are eager to see some cool industrial AI demos, all running in Docker, join us for this talk at DockerCon!

Improving the Human Condition with Docker
RTI International is an independent nonprofit research institute dedicated to improving the human condition, looking into areas of crime analytics, health economics, and more.
Online Feature Extraction and Event Generation for Computer-Animal Interaction
This talk will present an architecture developed to investigate interaction with and between animals. The architecture allows online processing of multimedia streams and the generation, storage, and visualization of events using feature extraction. It allows biologists to analyze the events by monitoring live or by replaying streams through a web interface. Docker Swarm is the central component of the architecture and serves as infrastructure for stream processing, event generation, event processing, and visualization.
The main entry point for users is a web interface that spins up one container per user and allows independent replay of streams. This talk will focus on the architecture and on technical details concerning its implementation, as well as how Docker is utilized to process, store, and visualize events. Some time will be spent explaining details about custom-made Docker solutions.
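As a minimal illustration of the online event-generation idea described above, here is a hypothetical detector (class name, window size, and threshold are assumptions, not the architecture's actual code) that turns a stream of extracted feature values into start/end events when a moving average crosses a threshold:

```python
from collections import deque


class EventDetector:
    """Emits start/end events when the moving average of the last
    `window` samples crosses `threshold`."""

    def __init__(self, window: int, threshold: float):
        self.samples = deque(maxlen=window)  # sliding window of recent samples
        self.threshold = threshold
        self.active = False  # True while the average is above threshold

    def feed(self, value: float):
        """Process one sample; return an event dict on a crossing, else None."""
        self.samples.append(value)
        avg = sum(self.samples) / len(self.samples)
        if avg > self.threshold and not self.active:
            self.active = True
            return {"type": "start", "avg": avg}
        if avg <= self.threshold and self.active:
            self.active = False
            return {"type": "end", "avg": avg}
        return None
```

A downstream service could store the returned events and a web interface could render them on the replay timeline; the same detector works identically on live and replayed streams, which is what makes the replay feature cheap to support.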
The scale and magnitude of computing and data have increased significantly in the last decade, making data delivery to the world a herculean research problem in itself. In addition, analyzing and peer-reviewing a research article demands considerable time and effort from each user.
The literature includes numerous outstanding climate studies published in international climate assessment reports, such as those of the Intergovernmental Panel on Climate Change (IPCC), the United Nations body for assessing the science related to climate change. The need to verify the research and make it reproducible and transparent before it gets translated into major decisions is, now more than ever, one of our most critical challenges. In this presentation, we will paint a picture of the history of climate computing and analytics, with significant transformations applied in order to make climate research meaningful, quantifiable, credible, interoperable, accessible, and reusable.
In other words, we will draw a path towards reproducible research using Docker containers for massive data publishing and climate analytics. Speaker: Aparna Radhakrishnan, Engility.

Transforming Education for the Next Generation of Software Engineers
How Holberton uses Docker containers to train software engineers, providing a high-quality and bias-free education, at scale.
The recent surge of Kubernetes and GitLab CI gave us a fascinating new toolkit that enables me to test, build, and deploy more of my content directly on the Internet; let me help you get your projects up and running on Docker Hub and GitHub!
Speaker: Rich Braun, Splunk.

Our lives and jobs today, as never before, require us to use and write scripts and software. This is particularly true in an environment devoted to research and innovation, as academia is. Portability can be a true nightmare when your specialization is as far from development as it can be. Docker has been introduced in academia not just for how powerful, flexible, and adaptable it is, but first of all for its simplicity. Having given Docker training to biologists and biotechnologists, I found that, though the concept of containerization is initially hard to grasp, what won their hearts was the simplicity and speed with which they could pick it up and exploit it in their daily work.
It is worth mentioning, and reminding ourselves, of those situations in which broadband speed and reliability are not a matter of course: here as well, Docker provides some nice, often overlooked functionality.
The life science community has been drawn in the recent past to bioinformatics and computational biology, new fields of research that only very recently found their space in education. While new researchers are often trained in coding and data analysis, the whole community benefits from tools and platforms that shorten the time to results. Docker now has an important place in a number of projects, allowing, like a virtual machine with pre-installed software, reproducibility of analysis, a major concern in science, while at the same time being easily portable, which is essential for collaborations.
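As a sketch of how such a reproducible, portable analysis environment might be declared, a Dockerfile that pins the base image and package versions could look like the following (the image tag, package versions, and file paths here are all hypothetical, chosen only to illustrate the pinning idea):

```dockerfile
# Hypothetical example: pin everything so every collaborator runs
# the identical environment, which is what makes the analysis reproducible.
FROM python:3.10-slim

# Exact package versions, not ranges, so rebuilds years later match today.
RUN pip install --no-cache-dir pandas==1.5.3 biopython==1.81

# Copy the analysis scripts into the image so the code ships with its environment.
COPY analysis/ /analysis/
WORKDIR /analysis
ENTRYPOINT ["python", "run_analysis.py"]
```

Sharing this one file alongside the code lets a collaborator reproduce the full environment with a single `docker build`, which is the low-effort packaging and distribution the next paragraph describes.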
Even more, scientists can now share their code, packaging and distributing it themselves with very little effort. Speaker: Alice Minotto, Earlham Institute.

Customer Case Studies

Digital Transformation Readiness - A Docker Journey at Halliburton
Digital technology is a business enabler at Halliburton, an energy services company that engineers solutions to maximize asset value for its customers. From remote locations around the world, to central data centers and ubiquitous public clouds, a digital transformation is underway with distributed computing as the new norm.
This session will cover an IT enterprise architecture perspective of Halliburton's containerization journey and why Docker is considered a key enabler for digital transformation. In the session, you'll learn:
- The architectural vision for Halliburton's distributed computing platform
- Why Halliburton selected Docker Enterprise Edition
- Halliburton's journey with Docker: what was easy, hard, and really difficult
Speaker: Torben Pedersen, Halliburton.

From Swarm to Kubernetes and Back Again
Companies across every industry are rapidly realizing the value of microservices, and the solutioning conducted in support of that revelation often leads to containers.
Along that path comes a decision that most IT engineers in this space are familiar with: "What orchestrator do we use?" Additionally, he'll cover how many operational and development problems can be solved with any orchestrator, and why Citizens Bank chose the path they did.

As a result, containerization was fast-tracked as the solution that can help them with the various tenets of their strategy: hyperconvergence, SaaS (ServiceNow), and workload portability.
Docker Enterprise proved to be the right solution to migrate their legacy applications to newer versions of Windows Server quickly, securely, and economically. Entergy IT has now delivered the ability for the business to run applications on-premise and in the cloud, and has future-proofed the applications for migration to new versions of Windows Server. In this session, Entergy will talk about how they are modernizing their infrastructure to become more agile and secure, and to enable workload portability.

Failure of either one of these VMs would mean an outage for one or all of the hosted sites.
WES began their investigation into Docker to address issues of fault tolerance, consistency, and portability. This increased their confidence in deployments and reduced the need for maintenance windows. Through the use of well-defined workflows and persistent storage, applications are continually redeployed and restored between environments with zero downtime and no loss of data. Additionally, developers can pull down and run any of the sites independently with configuration that matches production. Join this session to learn about the challenges and triumphs that Wiley faced when orchestrating CMS deployments in Docker!
Modernizing Insurance with Docker Enterprise: The Physicians Mutual Technology Journey
Physicians Mutual, a Nebraska-based insurance company, had worked on modernizing its systems for over a decade. In such a complex industry, any IT refresh can seem like a never-ending journey.
The existing application architecture made it difficult to scale or refresh applications individually. The company piloted a microservice architecture and an automated pipeline on Docker Engine CE to deliver the new corporate API. Speaker: Nathan Coberly, Physicians Mutual.

Modernizing Microsoft .NET Applications
Many enterprises have a large product portfolio of custom .NET applications. In most cases, these legacy-architecture applications can benefit from containerization. Containerization can increase product quality, portability, and testability while enhancing security, increasing hardware utilization efficiency, and enabling the adoption of DevOps practices and techniques, all while supporting an evolutionary re-architecture strategy.
With a bit of refactoring, we can also move towards a hybrid hosting modality, simplifying the transition to public cloud providers and enabling DR savings.

The previous company had been winding down for years, so server and software upgrades had not been on the radar for some time. That's when I started my adventure with Jireh in September, with a charter to modernize the applications running the manufacturing facility process and move them into VMs with no impact to manufacturing. That led me down a path of exploration and questions, and a requirement to do it all without manufacturing downtime.