I have put together on this page some of the things that I read, learn, and work on in my spare time. Basically, these are the things I like the most.
I should tell you that the order in which I present these topics does not reflect which ones I favor over the others. I simply chose to write first about the ones I felt most comfortable writing about.
The most important of all the sciences is physics; all the other branches of science have grown out of it. Physics is the love of my life. Throughout my life I have been deeply interested in this science, and I think everyone should be involved in it somehow.
Although I love math a great deal, I feel that physics handles math in a real fashion: it applies it to reality. Chemistry is a great science, but physics explains what is actually happening in the world of chemistry (though for a while it was the other way around).
Many problems in real life are very difficult to solve on computers using conventional methods, yet they pose no difficulty to us at all. Some of these problems are image and sound recognition, and control problems like balancing an inverted pendulum (a broom) or backing up a truck.
Until recently, all solutions to these problems were attempted using mathematical models and equations, and to be frank, not all of these problems can be modeled. Now scientists have taken a new direction (one that's right above our noses, haha!), a direction that leads us to the most powerful computer in the world: the brain.
Artificial Neural Networks, or ANNs, are networks of neurons much like those of the brain. This is not to say that ANNs are like the brain's neural networks; they are far from that. But I am sure that with a better understanding of the brain's building blocks we will be able to build better ANNs some day.
If you want to know how good ANNs are, then listen to this: an Artificial Neural Network was built and taught to read text and render it as speech. The network, NETTalk, was monitored during the training process and the results were amazing. In the early stages of training, the network displayed the behavior of children learning to speak, and the more NETTalk was trained, the better it performed.
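Just to give a flavor of what a single artificial neuron does when it is trained, here is a toy sketch in C++ (this is nothing like the real NETTalk network; the AND-learning problem and all the names in it are made up purely for illustration). The neuron adjusts its weights a little every time it answers wrong, which is the basic idea behind training:

// A minimal sketch of a single artificial neuron (a perceptron) learning
// the logical AND function. Only a toy illustration of training by
// adjusting weights; real networks use many such units in layers.
#include <iostream>

int main()
{
    // Training data: inputs and the desired output (logical AND).
    const int inputs[4][2] = { {0,0}, {0,1}, {1,0}, {1,1} };
    const int targets[4]   = {  0,     0,     0,     1   };

    double w1 = 0.0, w2 = 0.0, bias = 0.0;   // adjustable weights
    const double rate = 0.1;                 // learning rate

    // Show the examples repeatedly and nudge the weights toward the
    // correct answers (the classic perceptron learning rule).
    for (int epoch = 0; epoch < 20; ++epoch) {
        for (int i = 0; i < 4; ++i) {
            double sum = w1 * inputs[i][0] + w2 * inputs[i][1] + bias;
            int output = (sum > 0.0) ? 1 : 0;     // threshold activation
            int error  = targets[i] - output;
            w1   += rate * error * inputs[i][0];
            w2   += rate * error * inputs[i][1];
            bias += rate * error;
        }
    }

    // The trained neuron now reproduces the AND function.
    for (int i = 0; i < 4; ++i) {
        double sum = w1 * inputs[i][0] + w2 * inputs[i][1] + bias;
        std::cout << inputs[i][0] << " AND " << inputs[i][1]
                  << " -> " << ((sum > 0.0) ? 1 : 0) << "\n";
    }
    return 0;
}

After twenty passes over the four examples, the neuron answers all of them correctly, which is the same "gets better the more it is trained" behavior, just on a microscopic scale.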
I personally believe that Artificial Neural Networks will be big soon. Thinking computers depend on the advancement of this field. Someone out there will come up with a type of neural network that will change the way we think of things. After that, people will start having conversations with objects of all kinds: thinking desks, lamps, chairs, keychains, TVs, books, rocks, etc. I should say, though, that progress on ANNs has been really slow, and the field needs serious thinkers. Any contenders out there?
I hope this information excites you about Artificial Neural Networks. If you are still interested in the topic, look up Neural Networks at Pacific Northwest Laboratory for more information.
Genetic Algorithms are tools that help us find optimal or near-optimal solutions to very difficult problems using simple operators. Optimal solutions to problems like the Traveling Salesman problem and the Bin Packing problem can be found, but only by an exhaustive search through all possibilities. This means that the more the problem grows in size, the more time is required to find the solution; a 20-city tour, for example, already has more than 10^16 possible orderings.
This is not to say that Genetic Algorithms can always find the optimal solution to these types of problems, but at least they offer a different way of searching for solutions.
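To show what I mean by simple operators, here is a toy Genetic Algorithm sketch in C++. The problem it solves (maximizing the number of 1s in a bit string) is made up just for illustration and is obviously not Traveling Salesman or Bin Packing, but selection, crossover, and mutation work the same way on harder problems:

// A minimal Genetic Algorithm sketch: selection, crossover, mutation.
// The toy fitness function rewards bit strings with many 1s.
#include <algorithm>
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

const int STRING_LEN  = 32;   // length of each candidate solution
const int POP_SIZE    = 40;   // number of candidates in the population
const int GENERATIONS = 100;

typedef std::vector<int> Chromosome;

// Fitness: how many bits are set to 1 (the quantity we want to maximize).
int fitness(const Chromosome& c)
{
    return (int)std::count(c.begin(), c.end(), 1);
}

// Tournament selection: pick two random candidates, keep the fitter one.
const Chromosome& select(const std::vector<Chromosome>& pop)
{
    const Chromosome& a = pop[std::rand() % POP_SIZE];
    const Chromosome& b = pop[std::rand() % POP_SIZE];
    return fitness(a) > fitness(b) ? a : b;
}

int main()
{
    std::srand((unsigned)std::time(0));

    // Start from a completely random population.
    std::vector<Chromosome> pop(POP_SIZE, Chromosome(STRING_LEN));
    for (int i = 0; i < POP_SIZE; ++i)
        for (int j = 0; j < STRING_LEN; ++j)
            pop[i][j] = std::rand() % 2;

    for (int g = 0; g < GENERATIONS; ++g) {
        std::vector<Chromosome> next;
        while ((int)next.size() < POP_SIZE) {
            // Crossover: splice two selected parents at a random point.
            Chromosome child = select(pop);
            const Chromosome& other = select(pop);
            int cut = std::rand() % STRING_LEN;
            for (int j = cut; j < STRING_LEN; ++j)
                child[j] = other[j];

            // Mutation: occasionally flip one bit.
            if (std::rand() % 100 < 5)
                child[std::rand() % STRING_LEN] ^= 1;

            next.push_back(child);
        }
        pop.swap(next);
    }

    // Report the best candidate found.
    int best = 0;
    for (int i = 0; i < POP_SIZE; ++i)
        best = std::max(best, fitness(pop[i]));
    std::cout << "Best fitness after " << GENERATIONS
              << " generations: " << best << " / " << STRING_LEN << "\n";
    return 0;
}

Notice that nothing in the loop "understands" the problem; the population simply drifts toward better solutions, which is exactly why the same recipe can be pointed at very different problems.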
Although I like Genetic Algorithms, I don't think they are a very serious field to pursue. The reason I say this is that they are an inexact science, and you cannot absolutely depend on them. I should say, though, that they are fun to play with.
For more information, look at The Art of Genetic Algorithms (GAs) home page.
My favorite language right now is C++. After many years of programming in C, I have found that C++ is even better. The reason for its popularity is the object-oriented style that it offers, and that's exactly why I like it. I would recommend it to everyone who is interested in computer programming; don't miss this experience.
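Here is a tiny made-up example of the object-oriented style I mean (the Particle class and its members are invented just for illustration): the data and the operations that belong to it live together in one class, instead of being loose functions and structs the way they would be in C.

// A small taste of object orientation in C++: the object owns its data
// and knows how to update and print its own state.
#include <iostream>
#include <string>

class Particle {
public:
    Particle(const std::string& name, double mass)
        : name_(name), mass_(mass), velocity_(0.0) {}

    // Apply a force for some time; F = m a, so a = F / m.
    void accelerate(double force, double seconds)
    {
        velocity_ += (force / mass_) * seconds;
    }

    void print() const
    {
        std::cout << name_ << " is moving at " << velocity_ << " m/s\n";
    }

private:
    std::string name_;
    double mass_;       // kilograms
    double velocity_;   // meters per second
};

int main()
{
    Particle p("test mass", 2.0);
    p.accelerate(10.0, 3.0);   // push with 10 N for 3 seconds
    p.print();                 // prints: test mass is moving at 15 m/s
    return 0;
}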
I have also programmed in other languages. They are (including the above):
Data compression is a means by which information is reduced in size for the purpose of transfer. Upon transfer, the information should be recoverable to a degree that is acceptable to humans. This is my definition of compression. There are two types of compression: lossy and lossless. Lossy compression is applied to things like images, sounds, and any form of information that humans will recognize and feel comfortable with even if some of it is missing. Lossless compression is applied to text, computer programs, and any form of information that does not accept loss. Let me just say that it's not the information that does not accept loss; it's rather us. We cannot accept the loss because we have simply defined this kind of information as nonfunctional if it loses even a single bit.
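To make the lossless idea concrete, here is a toy run-length encoder and decoder in C++. Real text compressors use much cleverer schemes, and this toy would get confused if the input itself contained digits, but it shows the essential point: the decoded data matches the original exactly, with not a single bit lost.

// Toy lossless compression: runs of a repeated character are stored as
// (count, character), and decoding recovers the original exactly.
#include <iostream>
#include <sstream>
#include <string>

// Encode "aaabcc" as "3a1b2c".
std::string rle_encode(const std::string& in)
{
    std::ostringstream out;
    for (std::string::size_type i = 0; i < in.size(); ) {
        std::string::size_type run = 1;
        while (i + run < in.size() && in[i + run] == in[i])
            ++run;
        out << run << in[i];
        i += run;
    }
    return out.str();
}

// Decode "3a1b2c" back to "aaabcc".
std::string rle_decode(const std::string& in)
{
    std::istringstream iss(in);
    std::string out;
    std::string::size_type count;
    char c;
    while (iss >> count >> c)
        out.append(count, c);
    return out;
}

int main()
{
    std::string original = "aaaaabbbbbbcccd";
    std::string packed   = rle_encode(original);
    std::cout << "encoded: " << packed << "\n";              // 5a6b3c1d
    std::cout << "decoded: " << rle_decode(packed) << "\n";
    std::cout << (rle_decode(packed) == original ? "lossless!" : "oops") << "\n";
    return 0;
}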
Compression is divided into two categories: transform and encoding. A transform is used to convert information into a form that has a great deal of redundancy so that it can be encoded more efficiently. Transforms include methods such as wavelets, the DCT, and VQ. Some encoding methods include Huffman, LZW, and arithmetic coding.
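As a sketch of one of those encoding methods, here is a small Huffman example in C++. It only builds the code table for a made-up sample string (a real compressor would also write out the coded bits and the table itself, and would free its memory), but it shows the core idea: frequent characters get short codes, rare ones get long codes.

// Huffman coding sketch: build the tree by repeatedly merging the two
// lightest nodes, then read the codes off the tree.
#include <iostream>
#include <map>
#include <queue>
#include <string>
#include <vector>

struct Node {
    char symbol;   // meaningful only for leaves
    int  weight;   // how often this symbol (or subtree) occurs
    Node* left;
    Node* right;
    Node(char s, int w, Node* l = 0, Node* r = 0)
        : symbol(s), weight(w), left(l), right(r) {}
};

// Order nodes so the priority queue hands us the lightest one first.
struct HeavierThan {
    bool operator()(const Node* a, const Node* b) const
    {
        return a->weight > b->weight;
    }
};

// Walk the finished tree; left edges are '0', right edges are '1'.
void collect_codes(const Node* n, const std::string& prefix,
                   std::map<char, std::string>& codes)
{
    if (!n->left && !n->right) {
        codes[n->symbol] = prefix.empty() ? "0" : prefix;
        return;
    }
    collect_codes(n->left,  prefix + "0", codes);
    collect_codes(n->right, prefix + "1", codes);
}

int main()
{
    std::string text = "this is an example of huffman coding";

    // Count how often each character appears.
    std::map<char, int> freq;
    for (std::string::size_type i = 0; i < text.size(); ++i)
        ++freq[text[i]];

    // Merge the two lightest nodes until a single tree remains.
    std::priority_queue<Node*, std::vector<Node*>, HeavierThan> pq;
    for (std::map<char, int>::iterator it = freq.begin(); it != freq.end(); ++it)
        pq.push(new Node(it->first, it->second));
    while (pq.size() > 1) {
        Node* a = pq.top(); pq.pop();
        Node* b = pq.top(); pq.pop();
        pq.push(new Node(0, a->weight + b->weight, a, b));
    }

    // Print the code assigned to each character.
    std::map<char, std::string> codes;
    collect_codes(pq.top(), "", codes);
    for (std::map<char, std::string>::iterator it = codes.begin();
         it != codes.end(); ++it)
        std::cout << "'" << it->first << "'  ->  " << it->second << "\n";
    return 0;
}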
For more information on compression, check out the compression pointers.
email me at: mohamedqasem@yahoo.com
Back to my Home Page