Tuesday 12 June 2012

Using AI to Combat Cyber Crime

Artificial Intelligence has, for quite some time now, been used to combat credit card fraud. Data mining, which is in a way an application of AI, is used to detect credit card fraud through various mechanisms. In the most general scenario, a pattern of the user's credit card usage is drawn from his transaction records, and every future transaction is inserted into the pattern only after conforming to it. Whenever a transaction, or a pair of transactions, violates the pattern, the system prompts the surveillance personnel to check in. It is then at the discretion of the personnel to decide whether the transactions are to be investigated, or to be entered into the system and incorporated into the ever-changing pattern. In a more advanced form, the user's normal credit card usage pattern is used along with a pattern of the usage seen across the whole group to which the user belongs. This group may be formed on the basis of income, credit card category (gold, silver, etc.) or even the company the user works for. This scheme is more robust and more resistant to single high-value transactions that may appear to drift away from the pattern but are actually genuine.
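To make the idea concrete, here is a minimal sketch of the user-pattern check in Python. It is purely illustrative: it models the "pattern" as just the distribution of transaction amounts, and the threshold is arbitrary; a real system would use many more features (merchant, location, time of day).

```python
from statistics import mean, stdev

def is_suspicious(history, amount, z_threshold=3.0):
    """Flag `amount` if it lies more than z_threshold standard
    deviations from the mean of the user's past transactions."""
    if len(history) < 2:
        return False  # not enough history to form a pattern yet
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return amount != mu  # all past amounts were identical
    return abs(amount - mu) / sigma > z_threshold

past = [40, 55, 60, 45, 52, 48]    # the user's usual spending
print(is_suspicious(past, 50))     # False: conforms to the pattern
print(is_suspicious(past, 900))    # True: prompt a manual review
```

The group pattern described above could be added as a second check: a transaction that fails the user test but fits the group's distribution might still be allowed through without an alert.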

The above approaches have been quite effective in combating credit card fraud to some extent, and as a result, agencies all over the world have started looking at AI for combating other forms of electronic/cyber crime. They turned to AI because, given the humongous number of transactions, it's utterly impossible to employ humans to track movement over the internet. They need a machine to do that, and in fact they need a machine that's smart enough to match the wits of a human expert. The intelligence may either be embedded in the individual application servers, just like the spam filters used by mail servers, or implemented at the firewalls at the gateways. The advantage of embedding it in the individual servers is that logic specific to the application can be included; e.g., a traffic pattern may be acceptable when destined for a mail server but not for some office application server. In fact, the best approach is to divide the intelligence between the two places: general intelligence is embedded at the firewalls, and application-specific intelligence is embedded in the individual servers.

The general model suggests that some traffic analysis technique be used, and this technique would differ from network to network. Traffic could be analyzed at one or all levels: either only the datagram-level traffic, or the IP-level traffic, or both. The traffic is then matched against a general pattern of traffic, just like the pattern matching in credit card fraud detection. At the firewalls, the overall traffic pattern is analyzed, and at the individual servers, the application-level and session-level traffic is analyzed. At the application level, once again two patterns could be used - a user pattern and a group pattern - while at the firewalls a single pattern has to be used. In fact, the system may keep different patterns for different days or times of day instead of a single pattern, and use them accordingly. Like every cognitive learning mechanism, these patterns would also improve with time: the system matches the actual traffic with the stored pattern and keeps adjusting according to the traffic it analyzes. For example, if the system reports an anomaly and the network admin decides it's normal traffic, the system incorporates this into its traffic pattern model and improves itself. Hence, with time, the system becomes more and more effective.
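As a rough illustration of the firewall-side pattern matching and the admin feedback loop, here is a toy sketch in Python. All names and thresholds are made up; a real system would model far richer features than per-hour traffic volume.

```python
from collections import defaultdict

class TrafficModel:
    """Keeps a per-hour baseline of traffic volume and flags departures."""

    def __init__(self, tolerance=2.0, warmup=5):
        self.baseline = defaultdict(list)  # hour of day -> observed volumes
        self.tolerance = tolerance
        self.warmup = warmup

    def check(self, hour, volume):
        history = self.baseline[hour]
        if len(history) < self.warmup:
            history.append(volume)       # still learning this hour's pattern
            return "learning"
        average = sum(history) / len(history)
        if volume > average * self.tolerance:
            return "alert"               # prompt the admin to investigate
        history.append(volume)           # normal traffic refines the pattern
        return "ok"

    def admin_says_normal(self, hour, volume):
        # The admin judged flagged traffic to be genuine: fold it into
        # the stored pattern so the model improves with time.
        self.baseline[hour].append(volume)
```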

Thursday 24 May 2012

I Have A Dream

I bet you have wondered, not just once but many a time, why you dream. This post describes a theory that borrows some ideas from the domain of technology. Remember, we talked about the enormous network of brain neurons in an old post; we will visit that subject again. Just to tell the novice: the brain is composed of a massive network of countless neurons, and it's in these very neurons that information gets stored. Every neuron stores some part of the info, and when the brain needs to retrieve the complete information, it passes a signal through a known and structured pattern of neurons, and all the info in these neurons gets combined to give the complete information. The neural network concept, which is used to store information in artificially intelligent systems, uses a similar approach to organizing information. It was only after some research on these synthetic neural networks that a very speculative, yet intriguing, idea emerged on why a human being would dream. Here's a brief explanation of the same.


A network of neurons



While we are sleeping, most of our body parts get engaged in routine maintenance tasks. The brain takes up the diagnosis and correction work and checks whether all organs are functioning properly. It sends signals to these organs, and they do tasks like pumping some extra blood, disposing of waste in the impure blood and other similar work. The brain also sends some check signals to itself. Tough job, I must say! The primary maintenance task it takes up is to free neurons which hold invalid information, or information which may no longer be needed. But it's the brain, and given how intelligent it is, you wouldn't expect it to delete the info in these neurons straightaway. Instead, the brain first checks whether this info is useful and has to be retained.




The brain tries to see if these neurons hold any useful information by sending random, yet organized, signals to them. These signals are similar to the information retrieval signals introduced in para 1, except that they are sent while you are fast asleep. They travel across the neurons, and the information in those neurons gets combined. This information is played back while you are dreaming. Dreams are, in a way, a complement of the information you retrieve while you are awake. Now, these neurons held random pieces of information, and they are combined randomly by random signals. That's why the dreams we see often don't follow any specific flow of thought: we keep swinging from one place to another and combine people whom we met in different nations. In fact, the brain passes in its own inputs to make the dreams more meaningful, because it assumes it has to supply any missing information to fill the gaps. The brain may supply common-sense knowledge one has acquired, or knowledge of the places you have visited or the people you have met at any time in your life. That's the reason why, while dreaming, you end up combining several places into one and meeting 10 people from 10 different houses in your own.

That's how it all happens. When the brain is unable to find an ending for the dream, it just hangs up and waits for some real-life event. The real-life event, like the ringing of a bell or your falling on the ground, is combined into your dream so as to get an ending. And you wonder why you were dreaming of something that actually happened. After the dream gets over, the brain deletes all the information that was saved in those neurons. However, the dream that was generated is new information in its own right. Now it's up to the brain to decide whether the dream has to be retained or deleted. That's why at times you remember what you dreamt and at times you don't. One more thing to add: while generating some dreams, the brain spends a lot of effort, as the neurons it tries to combine don't hold structured information and the brain has to put in a lot of its own material to make it meaningful. That's when the brain decides to break the dream and wake you up. Or, if the brain succeeds in finally generating some coherent information, it lets the dream run and won't break it. But the next day when you get up, you won't feel fresh, as your brain had been working hard while you were asleep just to keep the dream ticking.


This was a very different theory, and that's why I felt like sharing it with you. And since we've gone a little off track for this one, I would also like to thank an all-time contributor to these posts - Tzinga Health Drink. I get all my ideas for posts, and how to present them, while I'm on my cycling spree every morning, and the spree lasts for 80-100 minutes. The time I spend makes me feel refreshed and gets my ideas ticking, but it also demands a lot of energy from me. So one of my pals told me to try this awesome health drink called Tzinga. It comes in three irresistible flavours - Mango Strawberry, Lemon Mint and Tropical Trip - and I feel it is the only health drink which also gives a really refreshing taste to your taste buds apart from giving some serious charge to your body. While other drinks demand that you fit a clip carrier to your bicycle for holding them, you can carry Tzinga straight in your pocket, and with its sipper-like outlet, you can easily take a sip as and when you need it. A pack of Tzinga contains some serious energy, so please be careful while using it.

Okay, now let's get back to the topic and draw some sort of a conclusion. This theory is one amongst a hundred theories on dreaming. Some say it's the subconscious mind at work, and some point to other complex mechanisms related to the locomotion of the body. But whatever it may be, this one actually explains most of the phenomena related to dreams, and I personally believe it might be the real reason. Please tell me if you agree with me, or if you have some different idea or any knowledge regarding this. I'm on this thing now !!!




Tuesday 1 May 2012

Artificially Intelligent Caching

The first two paras are only for those who have very little or no knowledge of caching.

Cache is considered to be the most expensive and most important memory resource a computer possesses. It is the memory the CPU can access in the least amount of time, so cache accesses are much faster than accesses to other memories. Whenever the CPU needs some data, it checks the cache first, and only if the cache doesn't have the data does it begin searching the other memories. The size of the cache is limited by architectural constraints as well as by the fact that it is made up of highly expensive SRAMs. So the catch is that a cache access takes the least time, but the cache can hold only a small part of the information a computer stores and needs. Hence, the system employs a replacement policy for replacing existing data in the cache with other data from the higher-level memories (RAM and secondary storage). It's this very replacement policy that decides the fate of the system. If the policy is designed such that on most occasions the data the CPU needs is already in the cache (a cache hit), the overall system will be fast and performance will go up. On the other hand, if the CPU frequently cannot find the needed data in the cache (a cache miss), performance will go down.

Data enters the cache when it's accessed for the first time. Suppose the CPU needs some data that resides in secondary storage. First the data goes from secondary storage to RAM, and then from RAM to the cache. The next time the CPU needs that data, it checks whether the cache still has it. If the cache doesn't, a miss occurs, and the RAM and secondary storage are searched, in that order. Once again, after the data is found, it gets transferred to the cache. The replacement policy comes into action after some fixed time period, or after the number of misses has crossed some threshold. It tries to predict the future: it removes from the cache the data least likely to be accessed in the future, and stores new data from the higher memories that is most likely to be accessed. It's this very future-prediction capability that makes a replacement policy succeed or fail. The most popular replacement policy is LRU (Least Recently Used), where the data that was accessed least recently is removed and the data that was accessed most recently is retained.
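For the novice, here is a minimal LRU sketch in Python. It is purely illustrative: real CPU caches implement this in hardware, and operating systems implement it inside the kernel.

```python
from collections import OrderedDict

class LRUCache:
    """Least Recently Used cache: evicts the entry unused for longest."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # ordered least -> most recently used

    def get(self, key):
        if key not in self.store:
            return None                     # miss: caller fetches from RAM/disk
        self.store.move_to_end(key)         # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        elif len(self.store) >= self.capacity:
            self.store.popitem(last=False)  # evict the least recently used
        self.store[key] = value
```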

LRU is driven by heuristics (a history of data usage, where time of access is the primary driver), and it is obviously not perfect. Given the impact the replacement policy can have on performance, one has to strive to improve its future-prediction capabilities. This is where researchers believe AI can be put to good use. AI has been a good predictor in several domains, and this is very likely to be another one where it can succeed. For replacing existing data in the cache with data that is more likely to be accessed, one would need heuristics (already put to good use by LRU) combined with some other predictor.

All the data of a computer system is organized as a file system, and modern file systems use tree structures to organize files. To add to the heuristics, the system needs to explore the most likely patterns in which data can be accessed. It's not just about the times at which data is accessed, but also about the pattern in which data is accessed across the file system tree. So our proposed replacement policy would strive to find the most probable patterns in which data might be accessed in the future. It can predict these patterns by utilizing the metadata from the file system, by storing observed data access patterns, and by constructing new patterns by following some logic.

Basic patterns can be generated using file access times and frequencies (something most modern file systems already store). These patterns can be compared with the real file access patterns observed in recent history (stored in some separate dedicated memory), and the comparison can be used to eliminate certain candidate patterns. Finally, a probable set of file access patterns is generated. During the policy's operation, this set of patterns is combined with heuristics (the same ones used in LRU), and replacement is done on the basis of the most probable data access pattern chosen by the policy. The numbers of misses and hits while using the set of patterns can be used both to modify the set itself and to switch to the next most probable pattern from the set. This corresponds to a kind of feedback, which is the core of learning in AI systems.
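Here is a toy sketch of how such a pattern-aware policy might score eviction candidates, combining the LRU recency heuristic with a learned "what follows what" pattern. Every name and weight here is hypothetical; a real implementation would live inside the OS or the file system.

```python
import time
from collections import defaultdict

def learn_patterns(access_history, window=2):
    """Count how often each file follows another in recent history."""
    follows = defaultdict(lambda: defaultdict(int))
    for i, name in enumerate(access_history):
        for nxt in access_history[i + 1:i + 1 + window]:
            follows[name][nxt] += 1
    return follows

def choose_victim(cached_files, last_access, follows, current_file):
    """Pick the cached file to evict: low likelihood of following the
    file just accessed, and not touched for a long time."""
    def score(name):
        recency_age = time.time() - last_access[name]  # LRU heuristic
        likelihood = follows[current_file][name]       # pattern predictor
        return likelihood - 0.001 * recency_age        # weights are arbitrary
    return min(cached_files, key=score)
```

Hits and misses observed at runtime would then feed back into `learn_patterns`, which is the learning loop the post describes.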

It's pretty clear that such a scheme would need some extra memory and logic for computing these patterns and then storing and changing them. But the patterns would take up very little space, and the processing could be done during free CPU cycles. In an overall context, the approach would be beneficial, as the cache hit ratio is very likely to increase. Such an approach is useful for general computer systems, where both files and file data are accessed in structured patterns; for example, certain applications always read some files before others. It can also be put to very good use in servers (both database and web), as even in these systems users are very likely to view some specific data or webpage before others. However, the approach may break down in cases where usage is highly unpredictable. The file systems won't need any drastic changes, but additional logic for pattern prediction and updating would be needed.

Like all other AI systems, this system too will get better with operation. Initially, it would generate file patterns on the basis of the metadata it gets from the file system. As it is put to operation, it would refine these initial patterns on the basis of the patterns actually observed. Finally, the patterns would be refined further as they are put into operation and feedback is incorporated into them. One may also think of some other prediction scheme for file usage, but the core concept remains the same: the system has to predict which files or data will be used the most at a point in time. And it's pretty obvious that even that approach would be using AI to serve its purpose. That's the power of AI !


Friday 27 April 2012

3D Printing: you may have one in the future

3D printing, or Additive Manufacturing, is a technique which is improving day by day. As the name suggests, 3D printing refers to manufacturing solid objects from a supplied digital pattern. Just as a 2D printer prints text or photos sent to it by the computer, a 3D printer reads in computer designs and manufactures the products those designs describe. A 3D model of the object is supplied to the printer; such models are developed using Computer-Aided Design (CAD) tools.

A small statue of Mario generated using a 3D printer
Unlike traditional object manufacturing (pottery, shoe making, clay modeling and what not), where objects are created by taking an initial block of building material and removing parts from it to get the final object (in short, subtractive modeling), in 3D printing objects are built by adding small quantities of the building material (additive modeling). The 3D design of the object is broken down into cross sections of various thicknesses. The thickness depends upon the object to be created as well as the technology used for 3D printing. The printing device takes this design, produces every cross section it specifies, and creates the final object by adding each new cross section onto the previous one. Hence, the overall object is built by producing single layers and joining them to each other as they are built. These layers of the manufacturing process correspond to the cross sections specified in the design.
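As a rough illustration of this slicing step, here is a toy Python sketch. Real slicers operate on triangle meshes (e.g. STL files); here the "model" is just a hypothetical function mapping a height to the outline the printer would trace at that height.

```python
def slice_model(height_mm, layer_mm, outline_at):
    """Return the model's cross sections, bottom layer first."""
    n_layers = round(height_mm / layer_mm) + 1
    return [outline_at(i * layer_mm) for i in range(n_layers)]

# Example: a 10 mm cone sliced into 0.2 mm layers; each "outline" is just
# the layer's radius here, standing in for a real 2D contour.
cone = slice_model(10.0, 0.2, lambda z: {"radius_mm": 5.0 * (1 - z / 10.0)})
print(len(cone), "layers")   # 51
```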

The aforementioned process is the core of all 3D printing technologies, but the manner in which the layers are produced, and how they are added to one another, distinguishes the various technologies. A broad categorization of the techniques is given below:

Molten Polymer Deposition: Uses a molten polymer and beads of some thermoplastic material. A movable nozzle, whose movement is controlled by the Computer-Aided Manufacturing (CAM) software, releases the thermoplastic beads to produce the layers. The beads harden immediately after the nozzle deposits them. The nozzle deposits these beads in the shape the layer should take, and then releases molten polymer to seal this layer to the next layer that will be built upon it. e.g.: Fused Deposition Modeling (FDM).

Granular Materials Binding: Small granules of the building material are fused together by heating them. When one layer is built up, another layer of granules is fused onto it, so no separate layer-binding step is needed here. e.g.: Electron Beam Melting (EBM), used for making metal parts; CandyFab, for making confectionery; and inkjet 3D printing, used for making colored 3D objects.

Photopolymerization: Uses a liquid polymer that turns solid on exposure to light. The liquid polymer is kept in a container and exposed to the light in a controlled manner. A build plate separates a cross section of the polymer from the rest of it. The light projector's position and coverage area are controlled so that only the desired portion of the polymer over the build plate gets exposed. In this way a layer is formed. Then the build plate moves downwards and another layer is formed similarly. At the end, the remaining liquid polymer is drained from the container and the object is retrieved. e.g.: Stereolithography (SLA).

A model of a sports shoe generated using 3D printing
3D printing is used for prototyping, where a small model of the actual object is made, or for creating artifacts that will actually be put into use. 3D printers can be used to produce jewellery, artwork, metal models, artistic confectionery, etc. The printer generates an object exactly matching the 3D model supplied, so it allows one to create multiple identical copies of the same design. 3D printing is perfectly complemented by 3D scanning: instead of manually creating a 3D model of some real-life object, one can simply use a 3D scanner to create the model, and once you have the model, just imagine how easy it is to build multiple copies of the object.




A bicycle model generated using 3D printing

Tuesday 24 April 2012

Can u smell what ur computer's cookin ???

Many of you might know this and many of you might not: apart from the general computer peripherals, there is a peripheral which can actually produce smell. Yes, in fact it's been quite some time since such gadgets were developed. They are based on the concept that a certain limited number of scents can be combined to produce other scents. The gadgets hold these basic scents in the form of cartridges or plain liquid containers. While operating, different quantities of these basic scents are combined in different proportions, just as happens in RGB or any other color model, and what results is an array of scents that imitate naturally available scents. Scentcom and certain other vendors are already manufacturing devices that exploit this mechanism. These devices are primarily used to enhance the multimedia experience by adding another sense to computing.
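As a rough sketch of the mixing idea, here is a toy Python example that blends a target scent from a few basic cartridges, analogous to mixing RGB channels. The cartridge names and the recipe are made up for illustration.

```python
BASIC_SCENTS = ["citrus", "floral", "woody", "musk"]   # hypothetical cartridges

def mix(recipe):
    """`recipe` maps a basic scent to its fraction of the blend."""
    assert all(name in BASIC_SCENTS for name in recipe), "unknown cartridge"
    assert abs(sum(recipe.values()) - 1.0) < 1e-9, "fractions must sum to 1"
    # A real device would open each cartridge's valve for a time
    # proportional to its fraction; here we just report the blend.
    return {name: recipe.get(name, 0.0) for name in BASIC_SCENTS}

# e.g. a hypothetical "fresh garden" scent:
print(mix({"citrus": 0.5, "floral": 0.3, "woody": 0.2}))
```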

One might wonder, then, why these devices never gained any major recognition. The primary reason is the limited number of scents that can be produced. Although that number is very large, only some of the combinations are meaningful, and a lot of naturally available scents may be difficult to produce by this mechanism. But the bigger problem is that all of this scent reproduction to imitate reality is done manually: someone has to guess at what proportions might be correct to produce the desired scent. This is where a major scope for improvement comes in.

Currently one has to generate these scents manually, but imagine what would happen if one could actually record a scent's composition and then use the composition data to reproduce the exact scent as and when needed. You and your friend have this peripheral and you two are chatting. And while chatting, you can actually smell what your friend is cooking !!!

Now, this major breakthrough would come only when a more advanced scent generation mechanism is created - one capable of producing almost all possible scents. Such a mechanism would concentrate on combining the core chemical compounds instead of combining scents, and the recording device would actually have to break a scent down into its constituent compounds and record them for reproduction at the receiver's side. I don't have to tell you how sophisticated this future scent machine needs to be, but just imagine what a revolution it would be. It would go another mile towards making your multimedia experience lifelike.

Sunday 22 April 2012

The Future Of Wired Communication

Natural plant fiber, which is found abundantly almost everywhere in the world, has unique refraction and X-ray diffraction characteristics, and a host of other properties that have gained it wide acceptance as a replacement material in certain fields. On one hand, it is used in natural composites to add strength to surface soil; on the other, it is used in certain components of automobile engines to capture particles that would otherwise pollute the air. Now another use for it has been anticipated.

Plant fiber from a few categories of plants has optical properties pretty similar to those of synthetic optical fiber made of glass. The one desirable property absent in these natural fibers, though, is the mechanical strength needed to work as a long-distance communication medium; this strength is needed because fibers must be spliced and merged as the distance increases. What compels us to move ahead and use these natural fibers as replacements for optical fiber, however, is the extremely low cost at which such fibers would come. Hence, biotechnology scientists have already started working on breeding such species of plants so that the fibers become stronger. In fact, certain fibers derived from other plants are indeed very strong, but they lack the required optical properties, so bio-technologists are striving to combine the two. It is also believed that if such fibers are made, they might perform even better than synthetic fibers, because their tissue structure provides better refraction than synthetic fiber.

We cannot say whether we'll actually see such fibers in the future, but the move towards them would certainly complement the philosophy of intelligent technology.