Tuesday, November 10, 2009

Microsoft Releases Silverlight 3

Here is an update on Silverlight, which has already been covered in this blog (see the Moonlight and Silverlight plug-in posts).
Microsoft has been actively developing Silverlight for the last two years, and it has now released the latest version, Silverlight 3.

What Is Silverlight?
Silverlight is a plug-in for delivering media experiences and interactive applications for the Web. Silverlight enables companies to design, develop, and deliver powerful applications and experiences for the web.
It works much like Adobe Flash and runs on Mac OS, Windows, Linux, and mobile devices.
Silverlight was first introduced in 2007; the second version was released in September 2008, and the new and improved Silverlight 3 arrived in July 2009 with a feature called Smooth Streaming.
The free, 4 MB plug-in delivers high-definition video playback right on your desktop.
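To give a feel for the adaptive-bitrate idea behind Smooth Streaming, here is a minimal sketch in Python (illustrative only, not Microsoft's implementation): the video is encoded at several bitrates and cut into short chunks, and the client picks the highest bitrate its measured bandwidth can sustain for each chunk. The bitrate ladder and safety factor below are assumptions for the example.

```python
# A minimal sketch of adaptive bitrate selection (assumed values, not Smooth Streaming's code).
AVAILABLE_BITRATES_KBPS = [350, 700, 1500, 3000, 6000]  # assumed encoding ladder

def pick_bitrate(measured_bandwidth_kbps, safety_factor=0.8):
    """Return the highest encoded bitrate that fits within the measured bandwidth."""
    budget = measured_bandwidth_kbps * safety_factor
    candidates = [b for b in AVAILABLE_BITRATES_KBPS if b <= budget]
    return candidates[-1] if candidates else AVAILABLE_BITRATES_KBPS[0]

for bw in (500, 2000, 8000):
    print(bw, "kbps link ->", pick_bitrate(bw), "kbps stream")
```

Because the client re-evaluates this choice chunk by chunk, playback can step down smoothly when bandwidth drops instead of stalling.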

Presentation Links

Click Here for Silverlight in Wikipedia

Thursday, September 24, 2009

HTML 5 -- The Future of the Web?


Some have embraced it, some have discarded it as too far in the future, and some have abandoned a misused friend in favor of an old flame in preparation. Whatever side of the debate you’re on, you’ve most likely heard all the blogging chatter surrounding the “new hotness” that is HTML5. It’s everywhere, it’s coming, and you want to know everything you can before it’s old news.

Things like jQuery plugins, formatting techniques, and design trends change very quickly throughout the Web community. And for the most part we’ve all accepted that some of the things we learn today can be obsolete tomorrow, but that’s the nature of our industry.

When looking for some stability, we can usually turn to the code itself as it tends to stay unchanged for a long time (relatively speaking). So when something comes along and changes our code, it’s a big deal; and there are going to be some growing pains we’ll have to work through. Luckily, rumor has it, that we have one less change to worry about.

In this article, I’m hoping to give you some tips and insight into HTML5 to help ease the inevitable pain that comes with transitioning to a slightly different syntax.

This specification evolves HTML and its related APIs to ease the authoring of Web-based applications. Additions include context menus, a direct-mode graphics canvas, a full duplex client-server communication channel, more semantics, audio and video, various features for offline Web applications, sandboxed iframes, and scoped styling. Heavy emphasis is placed on keeping the language backwards compatible with existing legacy user agents and on keeping user agents backwards compatible with existing legacy documents.

Presentation Links


Download PPT in MS OFFICE __ Click Here

Wikipedia Link __ Click Here

Click Here to Read all about HTML 5

Saturday, September 19, 2009

Sony 3D Television To Be Released In 2010 - A New Form Of Entertainment?

3D TV has been a promise stretching back into the 1950s, known as the Golden Age of 3D, when Disney, Paramount and Universal all had 3D film offerings and at the time felt the technology was the way of the future for video content. What killed that vision was the eye strain and headaches that usually resulted from an evening of viewing a 3D movie.

Fast forward a few technology cycles later, and 3D has come around again. For the past few years we've been seeing fledgling attempts at 3D TV from various manufacturers, but it seems like 2009 is the year all the majors are debuting versions of their own 3D technologies. We had a chance to get our oculars on Sony's working prototype and found the tech to be getting surprisingly close to prime time.

We're still not totally sold on 3D as the true next step of video content, but the clarity and depth of the trailers we saw were rather impressive. Gone are the ghosting and trail effects we've seen in the recent past, and the footage is no longer afflicted with errant afterimages and shadow outlines prominent especially during high motion sequences. The 3D content we saw on Sony's set was crystal clear and immersive.

We can't say we were entirely free of adverse effects after about 8 minutes of viewing. However, it's hard to say how much the 3D content versus a long couple of days of trade show coverage on little sleep was to blame. It definitely seems that the 3D trend is in full swing with Sony, Samsung, LG, Mitsubishi, Panasonic and Philips all showing off 3D displays on the show floor this year; many of them claim the technology will be in consumer homes as early as next year.

Presentation Links

Click Here to download the PPT in PDF

Click Here to download 3D TV tools pdf

Data Warehousing with Oracle Database 11g Release 2


As business operations become more complex, the demand for change in IT increases accordingly, complete with associated risks that must be mitigated. Today's IT professionals are being asked to manage more information, and deliver that information to their users, with ever increasing quality of service, in a timely manner. And, in today's economic climate, IT is additionally tasked with reducing budgets and deriving greater value out of its existing investments.

Oracle Database 11g Release 2 provides the foundation for IT to successfully deliver more information with higher quality of service, reduce the risk of change within IT, and make more efficient use of their IT budgets. By deploying Oracle Database 11g Release 2 as their data management foundation, organizations can utilize the full power of the world's leading database to:

  • Reduce server costs by a factor of 5
  • Reduce storage requirements by a factor of 12
  • Improve mission critical systems performance by a factor of 10
  • Increase DBA productivity by a factor of 2
  • Eliminate idle redundancy in the data center, and
  • Simplify their overall IT software portfolio.
Presentation Links

Click here to get official review from ORACLE in pdf

Click here to download PPT in MS 2007

Saturday, August 29, 2009

Wolfram Alpha : For a Better Search

Wolfram Alpha (also written as WolframAlpha and Wolfram|Alpha) is an answer engine developed by Wolfram Research. It is an online service that answers factual queries directly by computing the answer from structured data, rather than providing a list of documents or web pages that might contain the answer as a search engine might. It was announced in March 2009 by Stephen Wolfram, and was released to the public on May 15, 2009.

A Computational Knowledge Engine for the Web

In a nutshell, Wolfram and his team have built what he calls a “computational knowledge engine” for the Web. OK, so what does that really mean? Basically it means that you can ask it factual questions and it computes answers for you.

It doesn’t simply return documents that (might) contain the answers, like Google does, and it isn’t just a giant database of knowledge, like the Wikipedia. It doesn’t simply parse natural language and then use that to retrieve documents, like Powerset, for example. Instead, Wolfram Alpha actually computes the answers to a wide range of questions — like questions that have factual answers such as “What country is Timbuktu in?” or “How many protons are in a hydrogen atom?” or “What is the average rainfall in Seattle?”

Think about that for a minute. It computes the answers. Wolfram Alpha doesn’t simply contain huge amounts of manually entered pairs of questions and answers, nor does it search for answers in a database of facts. Instead, it understands and then computes answers to certain kinds of questions.

How Does it Work?

Wolfram Alpha is a system for computing the answers to questions. To accomplish this it uses built-in models of fields of knowledge, complete with data and algorithms, that represent real-world knowledge.

For example, it contains formal models of much of what we know about science — massive amounts of data about various physical laws and properties, as well as data about the physical world. Based on this you can ask it scientific questions and it can compute the answers for you, even if it has not been explicitly programmed to answer each question you might ask it.
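As a toy illustration of computing rather than retrieving — not Wolfram's actual architecture — here is a sketch in Python of a miniature knowledge engine that combines a small curated fact table with a simple derivation rule, so some answers are computed from structured data rather than looked up as documents. The facts and the rule are invented for the example.

```python
# Toy "computational knowledge" sketch (illustrative only; not Wolfram Alpha's design).
CURATED_FACTS = {
    ("hydrogen", "protons"): 1,
    ("timbuktu", "country"): "Mali",
}

def answer(subject, prop):
    """Return a stored fact, or derive one with a simple rule."""
    fact = CURATED_FACTS.get((subject.lower(), prop))
    if fact is not None:
        return fact
    # Derived answer: a neutral atom has as many electrons as protons.
    if prop == "electrons":
        protons = CURATED_FACTS.get((subject.lower(), "protons"))
        if protons is not None:
            return protons
    return "unknown"

print(answer("Hydrogen", "protons"))    # 1 (stored fact)
print(answer("Hydrogen", "electrons"))  # 1 (computed, never stored)
print(answer("Timbuktu", "country"))    # Mali
```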

But science is just one of the domains it knows about — it also knows about technology, geography, weather, cooking, business, travel, people, music, and more.

It also has a natural language interface for asking it questions. This interface allows you to ask questions in plain language, or even in various forms of abbreviated notation, and then provides detailed answers. The vision seems to be to create a system which can do for formal knowledge (all the formally definable systems, heuristics, algorithms, rules, methods, theorems, and facts in the world) what search engines have done for informal knowledge (all the text and documents in various forms of media).

Building Blocks for Knowledge Computing

Wolfram Alpha is almost more of an engineering accomplishment than a scientific one — Wolfram has broken down the set of factual questions we might ask, and the computational models and data necessary for answering them, into basic building blocks — a kind of basic language for knowledge computing if you will. Then, with these building blocks in hand his system is able to compute with them — to break down questions into the basic building blocks and computations necessary to answer them, and then to actually build up computations and compute the answers on the fly.

Wolfram’s team manually entered, and in some cases automatically pulled in, masses of raw factual data about various fields of knowledge, plus models and algorithms for doing computations with the data. By building all of this in a modular fashion on top of the Mathematica engine, they have built a system that is able to actually do computations over vast data sets representing real-world knowledge. More importantly, it enables anyone to easily construct their own computations — simply by asking questions.

The scientific and philosophical underpinnings of Wolfram Alpha are similar to those of the cellular automata systems he describes in his book, “A New Kind of Science” (NKS). Just as with cellular automata (such as the famous “Game of Life” algorithm that many have seen on screensavers), a set of simple rules and data can be used to generate surprisingly diverse, even lifelike patterns. One of the observations of NKS is that incredibly rich, even unpredictable patterns, can be generated from tiny sets of simple rules and data, when they are applied to their own output over and over again.
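For readers who have not met the "Game of Life" mentioned above, here is a compact Python implementation of one generation step; the point is how two short rules, applied repeatedly to a set of cells, produce surprisingly rich moving patterns such as the glider used below.

```python
from collections import Counter

def life_step(live_cells):
    """Compute the next generation from a set of live (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has exactly 3 live neighbours,
    # or if it is currently alive and has exactly 2.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

# A "glider": five cells whose pattern reappears, shifted, after four generations.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = life_step(glider)
print(sorted(glider))   # the same shape, translated one cell diagonally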

In fact, cellular automata, by using just a few simple repetitive rules, can compute anything any computer or computer program can compute, in theory at least. But actually using such systems to build real computers or useful programs (such as Web browsers) has never been practical because they are so low-level it would not be efficient (it would be like trying to build a giant computer, starting from the atomic level).

The simplicity and elegance of cellular automata proves that anything that may be computed — and potentially anything that may exist in nature — can be generated from very simple building blocks and rules that interact locally with one another. There is no top-down control, there is no overarching model. Instead, from a bunch of low-level parts that interact only with other nearby parts, complex global behaviors emerge that, for example, can simulate physical systems such as fluid flow, optics, population dynamics in nature, voting behaviors, and perhaps even the very nature of space-time. This is the main point of the NKS book in fact, and Wolfram draws numerous examples from nature and cellular automata to make his case.

But with all its focus on recombining simple bits of information and simple rules, cellular automata is not a reductionist approach to science — in fact, it is much more focused on synthesizing complex emergent behaviors from simple elements than in reducing complexity back to simple units. The highly synthetic philosophy behind NKS is the paradigm shift at the basis of Wolfram Alpha’s approach too. It is a system that is very much “bottom-up” in orientation.

Wolfram has created a set of building blocks for working with formal knowledge to generate useful computations, and in turn, by putting these computations together you can answer even more sophisticated questions and so on. It's a system for synthesizing sophisticated computations from simple computations. Of course anyone who understands computer programming will recognize this as the very essence of good software design. But the key is that instead of forcing users to write programs to do this in Mathematica, Wolfram Alpha enables them to simply ask questions in natural language and then automatically assembles the programs to compute the answers they need.

This is not to say that Wolfram Alpha IS a cellular automaton itself — but rather that it is similarly based on fundamental rules and data that are recombined to form highly sophisticated structures. The knowledge and intelligence it contains are extremely modularized and can be used to synthesize answers to factual questions nobody has asked yet. The questions are broken down to their basic parts, simple reasoning takes place, and answers are computed over the vast knowledge base in the system. It appears the system can make inferences and do some basic reasoning across what it knows — it is not purely reductionist in that respect; it is generative, it can synthesize new knowledge, if asked to.

Wolfram Alpha perhaps represents what may be a new approach to creating an “intelligent machine” that does away with much of the manual labor of explicitly building top-down expert systems about fields of knowledge (the traditional AI approach, such as that taken by the Cyc project), while simultaneously avoiding the complexities of trying to do anything reasonable with the messy distributed knowledge on the Web (the open-standards Semantic Web approach). It’s simpler than top down AI and easier than the original vision of Semantic Web.

Generally if someone had proposed doing this to me, I would have said it was not practical. But Wolfram seems to have figured out a way to do it. The proof is that he’s done it. It works. I’ve seen it myself.

Presentation Links

Click Here to download PowerPoint presentation in MS OFFICE 07

Click Here for Wikipedia Link

Click Here for the TechCrunch Review on Wolfram Alpha

Thursday, August 13, 2009

USB 3.0 Speeds Up Performance on External Devices

The USB connector has been one of the greatest success stories in the history of computing, with more than 2 billion USB-connected devices sold to date. But in an age of terabyte hard drives, the once-cool throughput of 480 megabits per second that a USB 2.0 device can realistically provide just doesn't cut it any longer.

What is it? USB 3.0 (aka "SuperSpeed USB") promises to increase performance by a factor of 10, pushing the theoretical maximum throughput of the connector all the way up to 4.8 gigabits per second, or processing roughly the equivalent of an entire CD-R disc every second. USB 3.0 devices will use a slightly different connector, but USB 3.0 ports are expected to be backward-compatible with current USB plugs, and vice versa. USB 3.0 should also greatly enhance the power efficiency of USB devices, while increasing the juice (nearly one full amp, up from 0.1 amps) available to them. That means faster charging times for your iPod--and probably even more bizarre USB-connected gear like the toy rocket launchers and beverage coolers that have been festooning people's desks.
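A quick back-of-the-envelope check of those figures, as a Python sketch (a rough approximation that ignores encoding and protocol overhead):

```python
# Rough arithmetic behind the USB 3.0 claims above (theoretical rates only).
usb2_mbits = 480
usb3_gbits = 4.8

usb3_mbytes_per_s = usb3_gbits * 1000 / 8   # 600 MB/s theoretical
usb2_mbytes_per_s = usb2_mbits / 8          # 60 MB/s theoretical

print(usb3_gbits * 1000 / usb2_mbits)       # 10.0 -> the promised "factor of 10"
print(usb3_mbytes_per_s)                    # 600.0 MB/s
print(700 / usb3_mbytes_per_s)              # ~1.2 s to move a 700 MB CD-R image
```

So at the theoretical signalling rate, "roughly a CD-R disc every second" is about right; real-world transfers will be slower.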

When is it coming? The USB 3.0 spec is nearly finished, with consumer gear now predicted to come in 2010. Meanwhile, a host of competing high-speed plugs--DisplayPort, eSATA, and HDMI--will soon become commonplace on PCs, driven largely by the onset of high-def video. Even FireWire is looking at an imminent upgrade of up to 3.2 Gbps performance. The port proliferation may make for a baffling landscape on the back of a new PC, but you will at least have plenty of high-performance options for hooking up peripherals.

Presentation Links

Click Here to download the PPT in MS Office 07

Wikipedia Link Click Here

Memristor: A Groundbreaking New Circuit


Photograph: Courtesy of HP

Since the dawn of electronics, we've had only three types of circuit components--resistors, inductors, and capacitors. But in 1971, UC Berkeley researcher Leon Chua theorized the possibility of a fourth type of component, one that would be able to measure the flow of electric current: the memristor. Now, just 37 years later, Hewlett-Packard has built one.

What is it? As its name implies, the memristor can "remember" how much current has passed through it. And by alternating the amount of current that passes through it, a memristor can also become a one-element circuit component with unique properties. Most notably, it can save its electronic state even when the current is turned off, making it a great candidate to replace today's flash memory.

Memristors will theoretically be cheaper and far faster than flash memory, and allow far greater memory densities. They could also replace RAM chips as we know them, so that, after you turn off your computer, it will remember exactly what it was doing when you turn it back on, and return to work instantly. This lowering of cost and consolidating of components may lead to affordable, solid-state computers that fit in your pocket and run many times faster than today's PCs.
Someday the memristor could spawn a whole new type of computer, thanks to its ability to remember a range of electrical states rather than the simplistic "on" and "off" states that today's digital processors recognize. By working with a dynamic range of data states in an analog mode, memristor-based computers could be capable of far more complex tasks than just shuttling ones and zeroes around.
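As a purely conceptual illustration of "remembering" current — a toy numerical model in Python, not HP's device physics — the sketch below integrates the current flowing through a bounded state variable and shows that the state stays put once the current is switched off. All constants are arbitrary.

```python
# Toy model of a memristor-like state variable (illustrative only).
def drive(state, current_amps, steps, dt=0.001, k=50.0):
    """Integrate current into a bounded internal state in [0, 1]."""
    for _ in range(steps):
        state = min(max(state + k * current_amps * dt, 0.0), 1.0)
    return state

s_after_pulse = drive(0.0, current_amps=0.01, steps=100)      # drive current for 100 ms
s_after_rest = drive(s_after_pulse, current_amps=0.0, steps=1000)  # power off for 1 s
print(s_after_pulse, s_after_rest)   # identical: the state is "remembered" with no power
```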
When is it coming? Researchers say that no real barrier prevents implementing the memristor in circuitry immediately. But it's up to the business side to push products through to commercial reality. Memristors made to replace flash memory (at a lower cost and lower power consumption) will likely appear first; HP's goal is to offer them by 2012. Beyond that, memristors will likely replace both DRAM and hard disks in the 2014-to-2016 time frame. As for memristor-based analog computers, that step may take 20-plus years.

Presentation Links

Click here to download PPT in MS OFFICE 07

Wikipedia Link Click here

Saturday, August 8, 2009

Conficker (Downadup) This Time More Intelligent

Conficker is the worm that exploded into prominence in January when it infected millions of machines by exploiting an already-patched bug in Windows that Microsoft had thought dire enough to fix outside its usual update schedule. The worm hijacked a large number of PCs - estimates ranged as high as 12 million at one point - and then assembled them into a massive botnet able to spread malware, plant fake antivirus software or distribute huge amounts of spam.

It all started in late October 2008, when we began to receive reports of targeted attacks taking advantage of an as-yet unknown vulnerability in Windows' remote procedure call (RPC) service. Microsoft quickly released an out-of-band security patch (MS08-067), going so far as to classify the update as "critical" for some operating systems—the highest designation for a Microsoft Security Bulletin.

It didn't take long for malware authors to utilize this vulnerability in their malicious code. In early November, W32.Kernelbot.A and W32.Wecorl appeared, demonstrating limited success in exploiting MS08-067. Still, much of the talk at the time focused on the potential havoc that could be caused by this vulnerability, as opposed to damage caused by these threats.

It wasn’t until late November that W32.Downadup appeared (also called Conficker by some news agencies and antivirus vendors). This threat achieved modest propagation success, partly due to its borrowing from the work of the Metasploit Project, which had been actively developing more reliable proof-of-concept methods to take advantage of the vulnerability.

One thing that set the Downadup worm apart from its counterparts of the last few years was its technical versatility. These secondary tricks weren't new; there were just so many of them. It scanned the network for vulnerable hosts, but didn't flood it with traffic, instead selectively querying various computers in an attempt to mask its traffic. It attempted to brute-force commonly used network passwords. It took advantage of Universal Plug and Play to pass through routers and gateways. And when the network proved too secure, it used a rather clever AutoPlay trick to get users to execute it from removable drives.

The threat even protected itself from takeover. Transferred payload files were encrypted, as well as digitally signed, and only the Downadup authors had the key. A “hot patching” routine for MS08-067 prevented further exploitation by other attackers or threats. The threat’s authors went to great lengths to prevent buffer overflow exploitation of their own code. No one was going to hijack this worm’s network of potential bots.

But the hidden danger behind all of this was the potential payload: Downadup contained the ability to update itself or receive additional files for execution. Again, not a new technique, but in this case the threat was generating a list of 250 new domains to connect to every day. Any one of these domains could potentially contain an update that, if downloaded, would allow the threat to perform further malicious actions. What sort of actions? Anything the authors wanted, really. Not only that, but the threat contained its own peer-to-peer (P2P) updating mechanism, allowing one infected computer to update another. Blocking access to the domains might protect you from one vector, but blocking a P2P update is a different matter.
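To make the domain-generation idea concrete, here is a hedged Python sketch of how such a scheme can work in general — an illustration of the technique, not Conficker's actual algorithm: seeding a pseudo-random generator with the date lets every infected machine derive the same daily list of 250 rendezvous domains with no central coordination, so defenders have to anticipate and block the whole list.

```python
# Generic domain-generation sketch (not Conficker's real algorithm).
import datetime
import random

TLDS = [".com", ".net", ".org", ".info", ".biz"]   # illustrative TLD pool

def daily_domains(date, count=250):
    """Derive `count` pseudo-random domain names from the date alone."""
    rng = random.Random(date.toordinal())           # same date -> same list everywhere
    domains = []
    for _ in range(count):
        length = rng.randint(8, 12)
        name = "".join(rng.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(length))
        domains.append(name + rng.choice(TLDS))
    return domains

print(daily_domains(datetime.date(2009, 2, 1))[:5])  # first few of that day's 250 names
```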

Infection rates began to decline in mid-February, as news of the threat spread and network administrators that had not applied MS08-067 scrambled to protect their networks. This could be in part due to the fact that, with propagation success, the threat garnered a significant amount of attention. The domains meant to update the threat were being closely watched by security vendors, researchers, and the media. This attention isn’t surprising given the threat’s complexity and the way it has utilized so many time-tested malicious code tricks, reinventing a few along the way. There is no doubt that Downadup has been the hot topic for 2009.

Presentation Links

Click Here to download the PPT in MS OFFICE 07

Wikipedia Link Click Here

Wednesday, July 29, 2009

New Robot With Artificial Skin To Improve Human Communication

Work is beginning on a robot with artificial skin, developed as part of a project involving researchers at the University of Hertfordshire, so that it can be used in their work investigating how robots can help children with autism learn about social interaction.

Kaspar. Credit: Image courtesy of University of Hertfordshire

Professor Kerstin Dautenhahn and her team at the University’s School of Computer Science are part of a European consortium, which is working on the three-year Roboskin project to develop a robot with skin and embedded tactile sensors.

According to the researchers, this is the first time that this approach has been used in work with children with autism.

The researchers will work on Kaspar (http://kaspar.feis.herts.ac.uk/), a child-sized humanoid robot developed by the Adaptive Systems research group at the University. The robot is currently being used by Dr. Ben Robins and his colleagues to encourage social interaction skills in children with autism. They will cover Kaspar with robotic skin and Dr Daniel Polani will develop new sensor technologies which can provide tactile feedback from areas of the robot’s body. The goal is to make the robot able to respond to different styles of how the children play with Kaspar in order to help the children to develop ‘socially appropriate’ playful interaction (e.g. not too aggressive) when interacting with the robot and other people.

“Children with autism have problems with touch, often with either touching or being touched,” said Professor Kerstin Dautenhahn. “The idea is to put skin on the robot as touch is a very important part of social development and communication and the tactile sensors will allow the robot to detect different types of touch and it can then encourage or discourage different approaches.”

Roboskin is being co-ordinated by Professor Giorgio Cannata of Università di Genova (Italy). Other partners in the consortium are: Università di Genova, Ecole Polytechnique Federale Lausanne, Italian Institute of Technology, University of Wales at Newport and Università di Cagliari.



Tuesday, July 28, 2009

LTE Technology

LTE (Long Term Evolution) is the last step toward the 4th generation of radio technologies designed to increase the capacity and speed of mobile telephone networks. Where the current generation of mobile telecommunication networks are collectively known as 3G, LTE is marketed as and called 4G, although technically it is 3.9G. Most major mobile carriers in the United States and several worldwide carriers have announced plans to convert their networks to LTE beginning in 2009. LTE is a set of enhancements to the Universal Mobile Telecommunications System (UMTS) which will be introduced in 3rd Generation Partnership Project (3GPP) Release 8. Much of 3GPP Release 8 will focus on adopting 4G mobile communications technology, including an all-IP flat networking architecture.

LTE Advanced is a mobile communication standard. It is currently being standardized by the 3rd Generation Partnership Project (3GPP) as a major enhancement of 3GPP Long Term Evolution. LTE (Long Term Evolution) standardization has come to a mature state by now where changes in the specification are limited to corrections and bug fixes. LTE mobile communication systems are expected to be deployed from 2010 onwards as a natural evolution of Global system for mobile communications (GSM) and Universal Mobile Telecommunications System (UMTS).

Presentation Links

Click here to download Motorola White Paper (Presentation Paper with Abstract) on LTE

Click on the link to download ppt shot (1) and ppt shot (2)

Wikipedia Links Click Here (LTE 3GPP)
Click Here (LTE Advanced)

Sunday, July 26, 2009

Google Wave



Google Wave is "a personal communication and collaboration tool" announced by Google at the Google I/O conference on May 27, 2009. It is a web-based service, computing platform, and communications protocol designed to merge e-mail, instant messaging, wikis, and social networking. It has a strong collaborative and real-time focus supported by robust spelling/grammar checking, automated translation between 40 languages, and numerous other extensions. Google announced on its official blog on Tuesday, 20 July 2009 that the preview of Google Wave would be extended to about 100,000 users on September 30, 2009. At that point, the current view of the Google Wave website (a preview with the keynote embedded) will be replaced with Wave itself.

Presentation Links

Presentation Video Link for Developers Click Here

Wikipedia Link Click Here


Saturday, July 25, 2009

aVsSEMINARs Launched

Hi Friends,

Our new blog platform dedicated to seminar topics has just been launched. You can get all the topics developed over the last 5-7 years at www.avsseminar.blogspot.com, and every topic includes a presentation. However, the latest posts on aVsonline are not available on aVsseminar, so for the latest news and technical topics, log on to aVsonline.

Thursday, July 23, 2009

Eye-Fi adds Wi-Fi to any Digital Camera!


First your phone went wireless, then your laptop, now finally your camera!
Never scrounge around for a USB cable again! Eye-fi is a magical orange SD memory card that will not only store 2GB worth of pictures, it'll upload them to your computer, and to Flickr, Facebook, Picasa (or 14 others) wirelessly, invisibly, automatically!
This little guy looks like a normal 2GB memory card and works with nearly any camera that takes SD memory. There are no antennas, no protrusions, no subscription fees, and no cables.
Here's how it works: You set up the card once with the included USB card reader (tell it which wireless network it should use, and type in the password if you have one), choose the photo sharing service of your choice (you have plenty of options), then slip the card in your camera.
From then on, you never have to touch anything. Just take photos. Whenever your camera is near the wireless network you selected and sits idle, Eye-Fi will upload all your photos (JPEGs only) to your online photo sharing service. Next time your computer's online, they'll download there, too!
Yes, it is practically magic.
Eye-Fi SD card comes in three varieties:
1). Eye-Fi Home ($79.99)
2). Eye-Fi Share ($99.99)
3). Eye-Fi Explore ($129.99)

Trend links

Wikipedia Link Click Here

The Morph concept __NOKIA

Morph Wrist mode

Launched alongside The Museum of Modern Art “Design and The Elastic Mind” exhibition, the Morph concept device is a bridge between highly advanced technologies and their potential benefits to end-users. This device concept showcases some revolutionary leaps being explored by Nokia Research Center (NRC) in collaboration with the Cambridge Nanoscience Centre (United Kingdom) – nanoscale technologies that will potentially create a world of radically different devices that open up an entirely new spectrum of possibilities.

Morph concept technologies might create fantastic opportunities for mobile devices:

  • Newly-enabled flexible and transparent materials blend more seamlessly with the way we live
  • Devices become self-cleaning and self-preserving
  • Transparent electronics offering an entirely new aesthetic dimension
  • Built-in solar absorption might charge a device, whilst batteries become smaller, longer lasting and faster to charge
  • Integrated sensors might allow us to learn more about the environment around us, empowering us to make better choices
In addition to the advances above, the integrated electronics shown in the Morph concept could cost less and include more functionality in a much smaller space, even as interfaces are simplified and usability is enhanced. All of these new capabilities will unleash new applications and services that will allow us to communicate and interact in unprecedented ways.

Flexible & Changing Design

Nanotechnology enables materials and components that are flexible, stretchable, transparent and remarkably strong. Fibril proteins are woven into a three dimensional mesh that reinforces thin elastic structures. Using the same principle behind spider silk, this elasticity enables the device to literally change shapes and configure itself to adapt to the task at hand.

A folded design would fit easily in a pocket and could lend itself ergonomically to being used as a traditional handset. An unfolded larger design could display more detailed information, and incorporate input devices such as keyboards and touch pads.

Even integrated electronics, from interconnects to sensors, could share these flexible properties. Further, utilization of biodegradable materials might make production and recycling of devices easier and ecologically friendly.

Self-Cleaning

Nanotechnology also can be leveraged to create self-cleaning surfaces on mobile devices, ultimately reducing corrosion, wear and improving longevity. Nanostructured surfaces, such as “Nanoflowers” naturally repel water, dirt, and even fingerprints utilizing effects also seen in natural systems.

Advanced Power Sources

Nanotechnology holds out the possibility that the surface of a device will become a natural source of energy via a covering of “Nanograss” structures that harvest solar power. At the same time new high energy density storage materials allow batteries to become smaller and thinner, while also quicker to recharge and able to endure more charging cycles.

Sensing The Environment

Nanosensors would empower users to examine the environment around them in completely new ways, from analyzing air pollution, to gaining insight into bio-chemical traces and processes. New capabilities might be as complex as helping us monitor evolving conditions in the quality of our surroundings, or as simple as knowing if the fruit we are about to enjoy should be washed before we eat it. Our ability to tune into our environment in these ways can help us make key decisions that guide our daily actions and ultimately can enhance our health.

Links for Nokia morph

Click here to view the video (.mov file 46mb size)

For more Pictures n Views Click here

Click here for Nokia Press Release

Wikipedia Link Click here

Sunday, July 19, 2009

Ambient Intelligence

Defined by the EC Information Society Technologies Advisory Group in a vision of the Information Society, Ambient Intelligence emphasises greater user-friendliness, more efficient services support, user-empowerment, and support for human interactions. In this vision, people will be surrounded by intelligent and intuitive interfaces embedded in everyday objects around us, and an environment recognising and responding to the presence of individuals in an invisible way, by the year 2010.
Ambient Intelligence builds on three recent key technologies: Ubiquitous Computing, Ubiquitous Communication and Intelligent User Interfaces – some of these concepts are barely a decade old and this reflects on the focus of current implementations of AmI (more on this later on). Ubiquitous Computing means integration of microprocessors into everyday objects like furniture, clothing, white goods, toys, even paint. Ubiquitous Communication enables these objects to communicate with each other and the user by means of ad-hoc and wireless networking. An Intelligent User Interface enables the inhabitants of the AmI environment to control and interact with the environment in a natural (voice, gestures) and personalised way (preferences, context).

Making AmI real is no easy task: as commonly happens with a new technology, soon after the high-flying visions we are shown the first pieces of hardware for the intelligent environment. However, making a door knob able to compute and communicate does not make it intelligent: the key (and challenge) to really adding wit to the environment lies in how the system learns and keeps up to date with the needs of the user by itself. A thinking machine, you might conclude – not quite, but close: if you rely on the intelligent environment you expect it to operate correctly every time without tedious training, updates or management. You might be willing to do it once but not constantly, even in the case of frequent changes of objects, inhabitants or preferences in the environment. A learning machine, I'll say.

The following articles in this special theme issue showcase the various aspects of AmI research in Europe. In addition to background information on AmI-related activities within the ERCIM members, we have a number of articles on the infrastructure for AmI environments, followed by algorithms adding some of the intelligence required to reach our goal for 2010.

Presentation Links

Click here to download PowerPoint presentation

Wikipedia Link Click here

ISTAG; Scenarios for Ambient Intelligence in 2010; Final Report,
Click here

Saturday, July 18, 2009

Moonlight (runtime) Plugin

Moonlight is an open source implementation of the Silverlight browser plug-in, based on Mono (an open source implementation of .NET).

Moonlight is being jointly developed by Microsoft and Novell to:

  • allow Silverlight applications to run on Linux
  • offer a Linux SDK (software development kit) for Silverlight applications
  • use the existing Silverlight engine to develop desktop applications.

Like Silverlight, Moonlight manifests as a runtime environment for browser-based rich Internet applications (RIAs) and, similarly, adds animation, video playback and vector graphics capabilities. Developers are also creating desktop widgets called "desklets" to extend Moonlight applications beyond the browser.

Presentation links

Click here to download Power Point Presentation

Wikipedia Link Click here

Storage Area Network

The Storage Network Industry Association (SNIA) defines the SAN as a network whose primary purpose is the transfer of data between computer systems and storage elements. A SAN consists of a communication infrastructure, which provides physical connections, and a management layer, which organizes the connections, storage elements, and computer systems so that data transfer is secure and robust. The term SAN is usually (but not necessarily) identified with block I/O services rather than file access services. A SAN can also be a storage system consisting of storage elements, storage devices, computer systems, and/or appliances, plus all control software, communicating over a network.

A SAN allows "any-to-any" connection across the network, using interconnect elements such as routers, gateways, hubs, switches and directors. It eliminates the traditional dedicated connection between a server and storage, and the concept that the server effectively "owns and manages" the storage devices. It also eliminates any restriction on the amount of data that a server can access, currently limited by the number of storage devices attached to the individual server. Instead, a SAN introduces the flexibility of networking to enable one server or many heterogeneous servers to share a common storage utility, which may comprise many storage devices, including disk, tape, and optical storage. Additionally, the storage utility may be located far from the servers that use it.
The SAN can be viewed as an extension to the storage bus concept, which enables storage devices and servers to be interconnected using similar elements as in local area networks (LANs) and wide area networks (WANs): routers, hubs, switches, directors, and gateways. A SAN can be shared between servers and/or dedicated to one server. It can be local, or can be extended over geographical distances.

Presentation Links

Click Here to download the Power point presentation

Wikipedia Link click here

SPINS -Security Protocol For Sensor Network

As sensor networks edge closer towards wide-spread deployment, security issues become a central concern. Sensor networks have been identified as being useful in a variety of domains, including the battlefield and perimeter defense. So far, much research has focused on making sensor networks feasible and useful, and has not concentrated on security.

We present a suite of security building blocks optimized for resource-constrained environments and wireless communication. SPINS has two secure building blocks: SNEP and μTESLA. SNEP provides the following important baseline security primitives: data confidentiality, two-party data authentication, and data freshness.

A particularly hard problem is to provide efficient broadcast authentication, which is an important mechanism for sensor networks. μTESLA is a new protocol which provides authenticated broadcast for severely resource-constrained environments. We implemented the above protocols, and show that they are practical even on minimal hardware: the performance of the protocol suite easily matches the data rate of our network. Additionally, we demonstrate that the suite can be used for building higher-level protocols.
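As a rough illustration of the kind of primitives SNEP provides — a generic MAC-plus-counter sketch in Python, not the SPINS specification itself (real SNEP also uses the counter for encryption, and μTESLA handles broadcast authentication differently) — two-party authentication comes from a shared-key MAC and freshness from a monotonically increasing counter:

```python
# Generic authentication + freshness sketch (not the SNEP protocol itself).
import hmac, hashlib

SHARED_KEY = b"assumed-preloaded-sensor-key"   # assumption: key shared at deployment

def protect(message, counter):
    """Attach a counter and a MAC computed over (counter || message)."""
    mac = hmac.new(SHARED_KEY, counter.to_bytes(4, "big") + message, hashlib.sha256).digest()
    return message, counter, mac

def verify(message, counter, mac, last_seen_counter):
    """Accept only authentic messages carrying a strictly newer counter."""
    expected = hmac.new(SHARED_KEY, counter.to_bytes(4, "big") + message, hashlib.sha256).digest()
    return hmac.compare_digest(mac, expected) and counter > last_seen_counter

msg, ctr, tag = protect(b"temperature=21.5", counter=7)
print(verify(msg, ctr, tag, last_seen_counter=6))  # True: authentic and fresh
print(verify(msg, ctr, tag, last_seen_counter=7))  # False: replayed message is rejected
```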

Presentation Links

Click here to download The power point presentation

For pdf infofile Click here

Wednesday, July 15, 2009

Cloud Computing

What is cloud computing?
Today, a growing number of businesses rely on the delivery of IT infrastructure and applications over the Internet (or "the cloud") to cost-effectively provide various IT applications. Couple that with advancements in virtualization technology, expanding bandwidth and the need to cut costs — and you can sense a fundamental shift in the way many businesses approach IT software and hardware investments.
Intel defines cloud computing as a computing model where services and data reside in shared resources in scalable data centers. Any authenticated device over the Internet can access those services and data.
The cloud has three components:
1. Cloud architecture: Services and data reside in shared, dynamically scalable resource pools, based on virtualization technologies and/or scalable application environments.
2. Cloud service: The service delivered to enterprises over the Internet sits on cloud architecture and scales without user intervention. Companies typically bill monthly for service based on usage.
3. Private cloud: Cloud architecture is deployed behind an organization's firewall for internal use as IT-as-a-service.

Presentation Links

Click here to download Microsoft's PowerPoint presentation

For more Click here to download the pdf document from Intel

Wikipedia link Click here

Tuesday, July 14, 2009

Microsoft Office 2010 is Coming

Microsoft Office 2010, codenamed Office 14, is the successor of Microsoft Office 2007, a productivity suite for Microsoft Windows. Extended file compatibility, user interface updates, and a refined user experience are planned for Office 2010. A 64-bit version of Office 2010 will be available. It will be available for Windows XP SP3, Windows Vista and Windows 7. Microsoft plans to release Office 2010 in the second half of 2010.

Microsoft Office 2010, as revealed by the just-released Technical Preview, brings a set of important if incremental improvements to the market-leading office suite. Among them: making the Ribbon the default interface for all Office applications, adding a host of new features to individual applications such as video editing in PowerPoint and improved mail handling in Outlook and introducing a number of Office-wide productivity enhancers, including photo editing tools and a much-improved paste operation.

Missing from the Technical Preview is what will be the most important change to Office in years -- a Web-based version for both enterprises and consumers. Also missing from the preview is access to Office for mobile phones and other mobile clients. Those features will be introduced in later versions of the software; the final version is expected to ship in the first half of 2010

Technical Review Link

Click here to download the technical link for ms office 2010

Wikipedia link Click here

Monday, July 13, 2009

IEEE 802.11n –Next Generation Wireless Standard

The newest standard in wireless LAN is called 802.11n. 802.11 is an industry standard for high-speed wireless networking. 802.11n is designed to replace the 802.11a, 802.11b and 802.11g standards. 802.11n equipment is backward compatible with older 802.11a/b/g gear, and it supports much faster wireless connections over longer distances. So-called "Wireless N" or "Draft N" routers available today are based on a preliminary version of 802.11n, and this draft version of the standard is already used in laptops and routers. 802.11n works by utilizing multiple-input multiple-output (MIMO) antennas and channel bonding in tandem to transmit and receive data, and it uses at least two antennas for transmitting data. 802.11n will support bandwidth greater than 100 Mbps, and in theory it can reach a speed of 600 Mbps. It can be used for high-speed Internet access, VoIP, network-attached storage (NAS) and gaming. The full version will be implemented in laptops and LANs in the coming years.
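A quick, assumption-laden look at where the 600 Mbps theoretical peak comes from (the per-stream rate and stream count below are the commonly quoted maximums for a bonded 40 MHz channel with a short guard interval; real-world throughput is far lower):

```python
# Back-of-the-envelope view of 802.11n's theoretical peak (assumed parameters).
per_stream_mbps_40mhz = 150   # one spatial stream on a bonded 40 MHz channel, short GI
spatial_streams = 4           # maximum MIMO streams defined by the standard

print(per_stream_mbps_40mhz * spatial_streams)   # 600 Mbps theoretical maximum
print(600 / 54)                                  # ~11x the 54 Mbps peak of 802.11g
```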


Presentation Links

Please click here to download the Microsoft PowerPoint presentation

Wikipedia Link Click here

Inferno OS

Inferno is an operating system for creating and supporting distributed services. The name of the operating system and of its associated programs, as well as of the company Vita Nuova Holdings that produces it, were inspired by the literary works of Dante Alighieri, particularly the Divine Comedy.
Inferno runs in hosted mode under several different operating systems or natively on a range of hardware architectures. In each configuration the operating system presents the same standard interfaces to its applications.
Applications are written in the type-safe Limbo programming language, whose binary representation is identical over all platforms.

A communications protocol called Styx is applied uniformly to access both local and remote resources, which applications use by calling standard file operations, open, read, write, and close. As of the fourth edition of Inferno, Styx is identical to Plan 9's newer version of its hallmark 9P protocol, 9P2000.



Presentation Links

Click here to download the ppt presentation for Inferno OS

Wikipedia Link Click here

Bayesian network

A Bayesian network, belief network or directed acyclic graphical model is a probabilistic graphical model that represents a set of random variables and their conditional independencies via a directed acyclic graph (DAG). For example, a Bayesian network could represent the probabilistic relationships between diseases and symptoms. Given symptoms, the network can be used to compute the probabilities of the presence of various diseases.

Formally, Bayesian networks are directed acyclic graphs whose nodes represent variables, and whose missing edges encode conditional independencies between the variables. Nodes represent random variables, but in the Bayesian sense: they may be observable quantities, latent variables, unknown parameters or hypotheses. Efficient algorithms exist that perform inference and learning in Bayesian networks. Bayesian networks that model sequences of variables (e.g. speech signals or protein sequences) are called dynamic Bayesian networks. Generalizations of Bayesian networks that can represent and solve decision problems under uncertainty are called influence diagrams.
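Here is a minimal worked example in Python of the diseases-and-symptoms case: a two-node network (Disease → Symptom) with made-up probabilities, where the conditional tables attached to the DAG let us compute P(disease | symptom) by Bayes' rule.

```python
# Two-node Bayesian network with assumed example probabilities.
P_DISEASE = 0.01                        # prior: P(D = true)
P_SYMPTOM_GIVEN_D = {True: 0.9,         # P(S = true | D = true)
                     False: 0.05}       # P(S = true | D = false)

def p_disease_given_symptom():
    """Posterior P(D = true | S = true) via Bayes' rule."""
    joint_true = P_DISEASE * P_SYMPTOM_GIVEN_D[True]           # P(D, S)
    joint_false = (1 - P_DISEASE) * P_SYMPTOM_GIVEN_D[False]   # P(not D, S)
    return joint_true / (joint_true + joint_false)

print(round(p_disease_given_symptom(), 3))   # ~0.154
```

Even with a 90% sensitive symptom, the low prior keeps the posterior modest, which is exactly the kind of inference the network structure makes explicit.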

Presentation Links

Please click here to download Microsoft Powerpoint presentation

Wikipedia Link Click here

Sunday, July 12, 2009

3D optical data Storage

3D optical data storage is the term given to any form of optical data storage in which information can be recorded and/or read with three dimensional resolution (as opposed to the two dimensional resolution afforded, for example, by CD).

This innovation has the potential to provide terabyte-level mass storage on DVD-sized disks. Data recording and readback are achieved by focusing lasers within the medium. However, because of the volumetric nature of the data structure, the laser light must travel through other data points before it reaches the point where reading or recording is desired. Therefore, some kind of nonlinearity is required to ensure that these other data points do not interfere with the addressing of the desired point.

No commercial product based on 3D optical data storage has yet arrived on the mass market, although several companies are actively developing the technology and predict that it will become available by 2010.

Presentation Links

Click here for 3D volume storage in pdf

Wikipedia Link Click here


Green Dam For Youth Escort

Green Dam Youth Escort is content-control software developed in the People's Republic of China (PRC). Under a directive from the Ministry of Industry and Information Technology (MIIT) of the PRC taking effect on 1 July 2009, it is mandatory to have either the software, or its setup files pre-installed on, or shipped on a compact disc with, all new personal computers sold in Mainland China, including those imported from abroad. End users, however, are not under mandate to run the software.

As of 30 June 2009, the mandatory pre-installation of the Green Dam software on new computers has been delayed to an unknown date. However, Asian brands such as Sony, Acer, Asus, BenQ and Lenovo are shipping the software as originally ordered.

The buffer overflow flaw exists in the latest, patched version of Green Dam, 3.17, according to security researcher "Trancer," who claims authorship of the attack code.
"I wrote a Metasploit exploit module for Internet Explorer, which exploits this stack-based, buffer overflow vulnerability in Green Dam 3.17," Trancer wrote in his Recognize-Security blog. "I've tested this exploit successfully on the following platforms: IE6, Windows XP SP2, IE7, Windows XP SP3, Windows Vista SP1."

The attack code, which has been posted to the Milw0rm Web site for proof-of-concept exploits, has been circulating in the wild for a week, according to security consultant and ZDNet blogger Dancho Danchev.

The Chinese government has ordered Green Dam censorware, billed as a pornography filter, to come preinstalled on all PCs sold in the country beginning July 1. Jinhui Computer System Engineering, which produces the software, patched Green Dam after a team from the University of Michigan exposed a buffer overflow flaw in it.

Last week, the researchers said in an addendum to their original paper that despite this patch, the software remains vulnerable to buffer overflow attacks, which indicates that Green Dam's security problems "run deep."

Green Dam intercepts Internet traffic using a library called SurfGd.dll. Even after the patch, SurfGd.dll still uses a fixed-length buffer to process Web site requests, the researchers explained. Malicious Web sites could overrun this buffer to take control of the execution of applications on a target computer.

"The program now checks the lengths of the URL and individual HTTP request headers, but the sum of the lengths is erroneously allowed to be greater than the size of the buffer," wrote the researchers. "An attacker can compromise the new version by using both a very long URL and a very long 'Host' HTTP header. The pre-update version, 3.17, which we examined in our original report, is also susceptible to this attack."

Green Dam is also vulnerable to a blacklisting flaw, identified by University of Michigan researchers Scott Wolchok, Randy Yao, and J. Alex Halderman, which could allow third parties to upload malware via an innocuous-seeming update.

Presentation Link

Click here to download the presentation style in pdf

Wikipedia Link for Green Dam Click here

Green Computing

Green computing is the study and practice of using computing resources efficiently. The primary objective of such a program is to account for the triple bottom line, an expanded spectrum of values and criteria for measuring organizational (and societal) success. The goals are similar to those of green chemistry: reduce the use of hazardous materials, maximize energy efficiency during the product's lifetime, and promote recyclability or biodegradability of defunct products and factory waste.

Modern IT systems rely upon a complicated mix of people, networks and hardware; as such, a green computing initiative must be systemic in nature, and address increasingly sophisticated problems. Elements of such a solution may comprise items such as end user satisfaction, management restructuring, regulatory compliance, disposal of electronic waste, telecommuting, virtualization of server resources, energy use, thin client solutions, and return on investment (ROI).

The imperative for companies to take control of their power consumption, for technology and more generally, therefore remains acute. One of the most effective power management tools available in 2009 may still be simple, plain, common sense.

Presentation Links

Click here to download the presentation in pdf form

Wikipedia link for Green Computing Click here

E-waste management

Sorry guys, this time it is not technology related. I want to do something useful for our society. I think this topic (e-waste management) is very valuable and will make an impression on people, since it is simple to present and we must all be aware of this problem.

Electronic waste, e-waste, e-scrap, or Waste Electrical and Electronic Equipment (WEEE) describes loosely discarded, surplus, obsolete, or broken electrical or electronic devices. The processing of electronic waste in developing countries causes serious health and pollution problems because electronic equipment contains some very serious contaminants such as lead, cadmium, beryllium and brominated flame retardants. Even in developed countries, recycling and disposal of e-waste involves significant risk, for example to workers and communities, and great care must be taken to avoid unsafe exposure in recycling operations and leaching of materials such as heavy metals from landfills and incinerator ashes.

Definition

"Electronic waste" may be defined as all secondary computers, entertainment device electronics, mobile phones, and other items such as television sets and refrigerators, whether sold, donated, or discarded by their original owners. This definition includes used electronics which are destined for reuse, resale, salvage, recycling, or disposal. Others define the reusables (working and repairable electronics) and secondary scrap (copper, steel, plastic, etc.) to be "commodities", and reserve the term "waste" for residue or material which was represented as working or repairable but which is dumped or disposed or discarded by the buyer rather than recycled, including residue from reuse and recycling operations. Because loads of surplus electronics are frequently commingled (good, recyclable, and nonrecyclable), several public policy advocates apply the term "e-waste" broadly to all surplus electronics. The United States Environmental Protection Agency (EPA) includes to discarded CRT monitors in its category of "hazardous household waste". but considers CRTs set aside for testing to be commodities if they are not discarded, speculatively accumulated, or left unprotected from weather and other damage.

Debate continues over the distinction between "commodity" and "waste" electronics definitions. Some exporters may deliberately leave difficult-to-spot obsolete or non-working equipment mixed in loads of working equipment (through ignorance, or to avoid more costly treatment processes). Protectionists may broaden the definition of "waste" electronics. The high value of the computer recycling subset of electronic waste (working and reusable laptops, computers, and components like RAM) can help pay the cost of transportation for a large number of worthless "commodities".

Presentation Links

Click here to download the power point presentation

Wikipedia Link for e-waste management Click here