ABSTRACT

Seamless, high-speed Internet access is the expectation set by recent technology trends. While technologies such as High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX) and Long Term Evolution (LTE) are promising and largely meet that expectation, a ‘digital divide’ still exists when it comes to penetrating rural areas in a seamless and cost-effective way.

The solution to this situation is to channel broadband Internet over the electricity supply, so that networking is carried out on the power mains. Distribution of Internet data over power lines is known as HomePlug or Broadband over Power Lines (BPL).

Electric Broadband is an innovation among these recent technology trends. It is certainly an encouraging and infrastructure-cost-effective model for offering high-speed broadband Internet access, with penetration even into rural areas, since virtually every home in the world is served by power lines.

INTRODUCTION

The communications landscape has changed rapidly since the inception of the Internet. Broadband Internet, as most people know it, is a data transmission mechanism over high-bandwidth channels, either through cables or over the air. The most common wireline broadband technology is Asymmetric Digital Subscriber Line (ADSL), while the emerging wireless broadband technologies are Mobile WiMAX and LTE-Advanced. However, all of these technologies require significant infrastructure investment to cater to the needs of the general public. Hence they are mostly limited to urban areas, and the digital divide still prevails, with the Internet failing to reach the masses in rural geographies.

WHAT IS ELECTRIC BROADBAND?

In contrast to these technology barriers, an innovative technology called ‘Electric Broadband’ is on the way to reach even rural areas with very little additional infrastructure cost, by carrying Internet data over relatively medium- to high-frequency electric signals. ADSL broadband already works on a similar principle: low-frequency electric signals carry ordinary phone calls, while higher-frequency signals carry Internet data. Electronic filters separate the two kinds of signal, with the low frequencies going to your telephone and the higher frequencies to your Internet modem. The principle behind Electric Broadband is much the same, and fairly simple: because electricity occupies only the low-frequency portion of the power line, data packets can be streamed over the higher frequencies.
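To make the filter principle above concrete, here is a minimal Python sketch (my own illustration, using the numpy and scipy libraries, which are not mentioned in the article) that mixes a 50 Hz ‘electricity’ tone with a higher-frequency ‘data’ carrier on the same line and then separates them again with simple low-pass and high-pass filters. The frequencies and filter settings are arbitrary assumptions chosen only to show the idea.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 100_000                       # sample rate in Hz (illustrative)
    t = np.arange(0, 0.1, 1 / fs)      # 100 ms of signal

    power = np.sin(2 * np.pi * 50 * t)            # 50 Hz mains 'electricity'
    data = 0.1 * np.sin(2 * np.pi * 20_000 * t)   # 20 kHz 'data' carrier (toy value)
    line = power + data                           # both ride on the same wire

    # A low-pass filter recovers the mains component; a high-pass recovers the data.
    b_lo, a_lo = butter(4, 1_000, btype="low", fs=fs)
    b_hi, a_hi = butter(4, 1_000, btype="high", fs=fs)

    recovered_power = filtfilt(b_lo, a_lo, line)
    recovered_data = filtfilt(b_hi, a_hi, line)

    print("power correlation:", round(float(np.corrcoef(power, recovered_power)[0, 1]), 3))
    print("data correlation: ", round(float(np.corrcoef(data, recovered_data)[0, 1]), 3))

Both correlations come out close to 1, which is the whole point: the two signals share one conductor but occupy different frequency bands, so simple filters can pull them apart.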

HOW DOES ELECTRIC BROADBAND WORK?

The key technical concept behind Electric Broadband data transmission is that Radio Frequency (RF) energy can be bundled onto the same line that carries electric current. Because the RF signal and the electricity oscillate at different frequencies, the two do not interfere with each other, and packets transmitted over RF are not lost to the electrical current. An Electric Broadband system uses only part of the complete power grid. Generating plants transmit power to substations, which distribute it over high-voltage transmission lines of roughly 155,000 to 765,000 volts; these lines are not suitable for packet or RF transmission. The Electric Broadband solution is therefore to bypass the substations and high-voltage wires and concentrate on the medium-voltage lines, which typically carry around 7,200 volts, before transformers step the current down to the 240 volts supplied to households. Put simply, standard fiber-optic lines, designed specifically for Internet transmission, carry the data to the medium-voltage lines. Repeaters are installed at these junction points to repeat the data and boost the strength of the transmission. Couplers, or specialized bypass devices, are installed at the transformers to provide a data link around them. From there, the digital data is carried down the 240-volt line that connects to the residential or office building’s electrical outlets, which become the final distribution point for the data.
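As a rough way of visualising the delivery chain just described, the short Python sketch below models the hops from the fiber backbone to a wall outlet as plain data and prints them in order. The hop names follow the paragraph above; the code itself is only an illustrative model of the path, not part of any BPL product or standard.

    from dataclasses import dataclass

    @dataclass
    class Hop:
        name: str
        medium: str
        note: str = ""

    # The downstream BPL path as described above (illustrative model only).
    BPL_PATH = [
        Hop("Internet backbone", "fiber optic line"),
        Hop("Fiber / medium-voltage junction", "injector with repeater",
            "repeaters boost the signal on the 7,200-volt line"),
        Hop("Medium-voltage line", "RF over power line"),
        Hop("Distribution transformer", "coupler",
            "the coupler carries data around the transformer"),
        Hop("240-volt service line", "RF over power line"),
        Hop("Wall outlet", "BPL modem or wireless transmitter"),
    ]

    def trace(path):
        """Print the hops a downstream packet would traverse."""
        for i, hop in enumerate(path, 1):
            extra = f" ({hop.note})" if hop.note else ""
            print(f"{i}. {hop.name} via {hop.medium}{extra}")

    if __name__ == "__main__":
        trace(BPL_PATH)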

At this juncture, residents and enterprises have two options for Internet connectivity. They can install wireless transmitters that receive the signal and relay the data on to computer stations, or they can use Broadband over Power Lines modems that filter the data: the Electric Broadband modem screens out power line noise, lets only the data through, and sends it onward to the stations. The wireless transmitter or the Electric Broadband modem can then deliver the signal to end users either wirelessly (which may require WLAN-capable devices) or over wires (which requires computers connected to the transmitter or modem via Ethernet cables).

TECHNOLOGY BENEFITS & BUSINESS CASE

Electricity being widely spread across the global landscape, including rural areas, Electric Broadband is set to be a penetrating technology that reaches those areas and helps break the digital divide in the communications space.

Many benefits can be foreseen from deploying this technology. It is affordable because it uses existing electrical wiring and outlets, avoiding expensive data cabling pulls and saving up to 75% of the infrastructure spend. It is very convenient for end users, since every electric outlet in every room becomes Internet-enabled. It is easy to use, as no software is necessary: it is simply “plug and play.” The technology is reliable, unlike wireless solutions that suffer from hit-and-miss service coverage, and moreover it offers a solution for universal coverage, operating at data transmission speeds of up to 6 million bits per second.

One of the best business cases will be power grid management, which becomes very effective once Electric Broadband is realized. Utilities are able to manage their systems better by having data streamed to them over the power lines. Because this benefit relates directly to the management of electricity, there remains a high likelihood of electric utilities investing more money in Electric Broadband. Being able to monitor the electricity grid over the power grid network will create a virtual workforce, with many fewer man-hours needed.

TECHNOLOGY CHALLENGES

While this technology has many advantages, there are some challenges as well. RF interference is the most serious challenge currently facing the technology. It faces opposition from amateur (ham) radio operators and the Federal Emergency Management Agency (FEMA), who are concerned that Electric Broadband will reduce the number of radio frequencies available for ham and short-wave radio operators, and that RF transmission over unshielded medium-voltage lines will cause interference with already-assigned frequencies. Another challenge is the considerable delay in ratifying technology standards. Transmission standards for Electric Broadband are still emerging, with draft versions yet to be released, and this is further hampering efforts to have the technology adopted by more Service Providers.

CONCLUSION

On a final note, widespread Electric Broadband is still at least two years away. However, a quick survey of the vendors involved suggests that Electric Broadband business is already happening to the tune of about $10 million annually, and the technology serves a much larger audience than any of its competing technologies. With that kind of potential, it should be able to sustain a growth rate two to three times that of either cable or telephone companies.

In today’s world, the workplace has been transformed. Computer technology is present to one degree or another in virtually every job or profession. To prepare students adequately for the workplace we must recognize that integrating computer technology into the classroom is essential. To execute this integration properly, careful planning must precede implementation. We must be prepared to explore different means of implementation inasmuch as there is no perfect system or a “one size fits all” software program. Each institution must decide to what degree they will implement technology and how quickly they will do so. It is also important to appeal to educational leaders for support as well as gathering preferences from both teachers and students.

In his article “Investing in Digital Resources,” David McArthur explored the notion that the decision about whether or not to use technology as an educational medium has already been made. What must be done now is to plan carefully to ensure that the long-range goals of technology integration are properly served.

The leaders in higher education must “plan for and invest in e-learning.” (McArthur, 2004, p3) E-learning has become an accepted method of education just as the “Web” has been accepted in business and at home. Integrating the newer technologies to supplement existing learning has become imperative. When planning is performed correctly, the educational environment should be able to use technologies to increase teacher/student communication, enhance faculty morale by use of an “on-line resource center,” (McArthur, 2004, p2) use web-based programs to enhance recruitment, and better prepare students for the workplace.

There are potential problems that must be overcome when planning for technological integration. First, the technological options are myriad and only a few will be appropriate for a given school or college. Second, while many institutions are becoming accustomed to the idea of augmenting their educational systems via e-learning, the change can be troublesome and radical.

Some key issues in the potential success of e-learning adoption include (but are not limited to) the school or college’s present computer network capacity, the willingness of the school’s leaders to support change, current or probable resources, and the accessibility of the e-learning services to the students.

In looking at a comprehensive long-range plan, there are a number of options available. One is “Staged Implementation.” (McArthur, 2004, p4) While the critical planning should be virtually complete, not all components of the final plan need be in place at the outset. A multi-year implementation schedule can be used. Not only does this allow for the development of resources, it also makes it possible to troubleshoot elements as each stage progresses. Another is “Appropriate Outsourcing.” (McArthur, 2004, p4) Not every educational institution has the in-house resources (personnel, tools, equipment) to implement even a staged plan. Outsourcing can save both cost and time. While it may be difficult to convince some leaders of the potential advantage of outsourcing, especially since this type of expertise “is regarded as an educational core asset” (McArthur, 2004, p6), drawing comparisons to the business world may help to demonstrate the benefits.

In his article “Herding Elephants: Coping with the Technological Revolution in our Schools,” Scott Tunison addressed two issues: 1) the extent to which schools need to attend to computer technology, and 2) the tactics used to make the most of its potential advantages and diminish the potential pitfalls of integrating the technology.

His reference to “Herding Elephants” is an allegory for managing the coming technology and learning to “integrate it into the educational framework,” or for moving aside and letting the “technological revolution” pass by. (Tunison, 2004, p7) Either way, educational technology is not to be ignored, and it cannot be allowed to manage itself.

Fundamentally speaking, much of education is unchanged from long past. The methods that have been used were for the most part appropriate for the subject at hand. A perception might be that, if the concepts to be learned have not changed then a change in teaching method is not necessary. However, even if some of the concepts have not changed, the application context as well as the learners’ context has. While computers have entered the educational environment they often have been simple substitutes for other tools that already exist and are in place; tools such as blackboards, books, etc. What this means is that the process of learning remains unchanged when new uses for the available technology are not fully utilized.

Educational reform is necessary if we are going to meet the needs of our students. If our culture has developed electronic media, animation, etc. then that is the context through which we must reach our students.

The changes that must be made can make some educators uneasy. The learning paradigm must shift from the teacher as dispenser of knowledge to the student as active learner. Tunison cites Fullan (2001), who identifies “three broad phases to the change process”: “initiation, implementation, and institutionalization.”

Initiation involves some entity proposing directional change. Sometimes students ask for change and sometimes groups of teachers, administrators, and parents form committees to begin a planning process for technological integration.

Institutionalization includes the perception of importance. One might say this is the stage of “damage control.” Clear policies, well trained teachers and administrators, and a supportive school board are crucial in this stage. It is important in this stage to record relevant data regarding the program for analysis. What was well planned and conceived may still have “bugs” to work out. The analysis of the data can assist in the “tweaking” of the program.

Educators must be aware of the importance of technology in the educational environment and be prepared to integrate it. Technology is extensive in our contemporary culture and reaching our students must involve meeting their needs in the world they know. We may, in fact, reach more students and perhaps stem the tide of dropouts.

In her article “What Students Want to Learn About Computers,” Judith O’Donnell Dooling informs the reader that students, parents, and administrators have specific preferences with regard to computer technology.

Over time, the importance of computers and related technology has been recognized. However, while spending on computers has risen, some schools have been less successful in identifying the specific computer skills students need and in harnessing the computer’s power as a tool for learning and teaching.

Student responses were varied. Many reported that they began learning about computers at an early age, usually from a more experienced person. Some students, especially in grades four through seven thought learning independently was the most enjoyable.

Interestingly, students of both genders reported reasonable confidence in their computer abilities, but some differences in perception were evident: some girls, and primarily boys, thought that computers were too technical for girls.

The experience students had prior to school, the teacher, and computer access all had a significant effect on student computer learning. Even if students had seen the computer more as a toy at home, they began to see it as a tool in the school setting, and they recognized its importance and power as their exposure increased.

Perhaps unlike other subjects students learn in school, students exchanged computer tips, recommended hardware and software, and generally discussed the subject of computers during their lunchtime and recess.

The students also saw the importance of computer knowledge as it related to its use in the workplace after their school experiences. They observed that, no matter where you work, you would be using computers to some degree.

The teachers expressed concern that not all of them shared the same proficiency. Many mentioned that the students often knew more than the teacher did. Teachers also observed that, though the students had a great deal of computer knowledge, it was often limited to games and software. Another observation was that computer curricula vary greatly from school to school.

Teachers expressed that computer knowledge needs to be relevant. That is, it needs to be applied across the curriculum and used as an integral tool of learning. All agreed that the role of teacher needs redefinition and adequate professional development provided to facilitate the needed change.

In conclusion, we have seen that computer technology in the educational setting is essential for learning in contemporary society. Selecting, planning, and implementing must be done with great care to avoid waste and potential incompatibility with the goals of the educational institution. School leaders must be convinced that a paradigm shift is not optional, that teachers and students must assume new roles, and that their support for new ideas is essential.

We must also be able to meet students where they are. Our culture has created systems of technology to which students are accustomed. To continue teaching in an antiquated fashion does our students a disservice, especially if we are to prepare them for the workforce following their education. We must also be aware of teacher and student preferences if we are to expect them to fully utilize the new resources.

As an information technology specialist myself, I find it constantly frustrating how I am misled or left uninformed by vendors and retailers about the buying decisions I might make. There are plenty of PC buying guides out there, but they are either so specific about technology choices that they date very quickly, or they do not help you meet your particular requirements. They are often too high-level, explaining only the very simplest specification details, and the minute a sales rep or consultant gives you other options or explanations you are lost. This guide is aimed at the novice to moderately experienced PC user. If you are a guru or expert, you should know most of this already.

As an example of how easy it is to be misled, a very well-known leading PC brand was recently advertising its ‘xyz-wizbang’ PC with an amazing 12GB of memory, an Extreme Intel quad-core processor and quad graphics cards. Sounds impressive, huh? When I saw the low price I became suspicious. You click on the link for more details, then on the options, then on the technical specification, read it very carefully, and you find it actually has only 3GB of memory (expandable to 12GB), a standard Intel processor (with the Extreme as an option), and support for quad graphics cards while shipping with just one. You can imagine that, without digging into the detail, the price would have been quite seductive.

A favourite proverb of mine goes something like ‘Give a hungry man a fish and feed him for a day; give him the tools to fish and feed him for life’. Well, that just about perfectly summarises the intention of this guide. Given just a little more information, you can adequately specify your own requirements, cross-examine vendors and retailers about their advertised machine specifications, and reward yourself with a good quality PC that will last and do all that you want it to. The added bonus of learning to buy this way is that it won’t date: the same concepts I explain here have applied broadly since the mid 1980s. A lot of the understanding lies in demystifying the jargon, and I will do a lot of that using simple terms. Clearly more understanding is needed, as I still get asked from time to time ‘What is the difference between 4GB of RAM and 300GB of hard disk, and which do I need?’. Hmmmm, if you are in this category you need to read this now….

The components of the PC

Before we can make decisions we need to know what everything in the PC does and how it does it.

  • The CPU or Processor – The processor is the engine of your PC: it executes instructions millions of times a second to get the work you want done. Modern processors have multiple cores and are known as dual core (2 cores) or quad core (4 cores; ‘octa’ 8-core processors will soon be available), which makes them a bit like my wife, i.e. capable of doing more than one thing at a time, or multi-tasking. So let’s say I ask my computer to give me a million lottery numbers and it takes eight seconds to complete (it would actually finish in the blink of an eye). With a dual core this would only take four seconds, as I could get one core to give me half a million numbers and the other to do the same at the same time. By the same logic, a quad core would take only two seconds. Breaking up tasks like this is called multi-threading (see the speed-up sketch after this list). So that’s the theory: if you can break a big task into multiple smaller tasks that can all be executed simultaneously, then the more cores the better. However, there’s a catch. Not all tasks can be broken up this way, and not all software vendors write their programs this way, so you need to make sure the software you use can take advantage of it; then you will know whether you should go for a dual, quad or even octa core processor. The other factor that affects performance is the clock speed of the CPU, expressed in GHz (billions of cycles per second). Most processors these days run somewhere between 1.8GHz and 3.3GHz, and all cores in a multi-core CPU run at the same clock speed. So if you see a manufacturer describe a PC as ‘12GHz’, what they are probably doing is multiplying the clock speed by the number of cores (4 cores by 3GHz), perhaps to make their PCs look phenomenally faster than anyone else’s; who knows. Clock speed is simpler than the number of cores: a faster clock speed simply means faster execution times, period. Therefore, if you can’t get the benefit of more cores, you should at least be able to get the benefit of higher GHz.
  • The Memory (or RAM) – While the computer is on, the memory is where the CPU stores its work in progress. Memory is the fastest place the PC can store information, so that is where it prefers to do your work. However, if it runs out of available memory it starts storing things on your hard disk instead (known as paging to virtual memory), and this is when things slow down dramatically. So make sure you don’t skimp on memory: get more than enough of it, as much as you can afford. Secondary to that is how fast the memory itself is. As a guide, Windows Vista really eats the first 1GB, so your minimum ought to be 2GB (DDR2) or 3GB (DDR3) for general light use, and 4GB (DDR2) to 6GB (DDR3) or more for demanding games or applications. Memory speed is a combination of clock rate (MHz), type and latency. It is also important to remember that bandwidth is different from speed. Imagine the memory bus is like a road. On a single-lane road with cars travelling fast, say at 70mph, each car gets to its destination quickly, but only so many cars fit on the road. A four-lane highway, even if it is slower at 55mph, gets more cars to their destinations in the same time period, although each individual car takes longer to arrive; this is akin to bandwidth. Nonetheless the two are interlinked, since a narrow road can match the bandwidth of a multi-lane highway if the cars move fast enough. Different demanding tasks call for bandwidth, speed, or a balance of both to work optimally. The memory technology generation also influences bandwidth, i.e. DDR, DDR2, DDR3 and so on. DDR stands for Double Data Rate, and each successive generation transfers more data per clock cycle, so at a given core clock DDR2 offers roughly twice the peak bandwidth of DDR, and DDR3 roughly twice that of DDR2, but each generation also carries a higher latency penalty (see the bandwidth calculation after this list). As a rough rule of thumb, the MHz rating should roughly double for each step up in DDR generation to keep effective latency comparable. So to get DDR2 memory as responsive as DDR 400MHz, the DDR2 needs to be 800MHz, at which point you also get greatly increased bandwidth. Think about this carefully, because DDR3 1333MHz is not automatically better than DDR2 1066MHz for the reasons explained; it is often assumed the latest technology is better, and that isn’t always the case. At the time of writing you ought to expect a new PC to come with DDR3 1333MHz to 1600MHz memory, or, if it has DDR2, then 800MHz or more. As far as latency is concerned, it gets complicated to explain, but if you are doing demanding work make sure it is low-latency memory. For gaming and general work, speed is more important than bandwidth; for video encoding or other tasks that move large volumes of data around, bandwidth is the priority.
  • The Hard disk (HDD or Storage) – The hard disk, unlike memory, is a slower but permanent store. When your computer is switched off, all your files remain there ready for when you switch it back on again. For the vast majority of people the only thing of relevance is whether the store is big enough. Mechanical hard disks are so cheap now that you can get far more capacity than you’re ever likely to need for not a lot of money; typically a new PC should have at least 300GB of storage capacity. If you do a lot of photography, database or video work, then the speed of the disk matters to you. Four things affect hard disk speed: 1) its rotational speed (usually 7,200rpm, but up to 15,000rpm); 2) how much data it stores per square inch of its surface (the platter), i.e. areal density; 3) the interface speed to the PC, which should now be SATA-II, capable of transferring data at up to 300MB/s; and 4) the speed at which the drive head can move across the surface (average seek time, usually around 8ms). The last of these is usually the least important. A technology that is quickly maturing is the Solid State Disk or SSD. It has no moving parts and uses a special type of permanent memory (flash memory, like USB sticks have) to store the computer’s data just like a hard drive. It is a complex and specialised topic in itself. For the vast majority of people SSDs are currently too expensive to offer much value, and cheap SSDs will only outperform a good hard drive in limited scenarios.
  • The Graphics Card (or GPU, Graphics Processing Unit) – If you don’t play games, edit video, or do 3D graphical modelling, CAD or design work, you can skip this section, as any modern graphics card should do. That includes photographers, since photographic work is still chiefly constrained by the CPU rather than the graphics card. The motivation for investing so much technology in graphics cards has been the demands of real-time 3D graphics processing. This is so demanding that there is now arguably more processing power in the GPU of high-end cards than in the main processor of the PC. There are essentially two contenders in the field, ATI and nVidia. Both are excellent and offer very similar performance in terms of price-value, and they keep swapping the crown for the ultimate fastest at any one time. Generally, unless you need the best of the best, the second or third card down from the top of the range should do all you need and will be considerably cheaper. Graphics cards have their own dedicated processor at their heart, known as the GPU. The speed of this GPU is measured much like your main processor, i.e. in terms of cores (streams) and MHz; more cores and higher MHz generally make it faster. There are architectural differences between ATI and nVidia GPUs, so you cannot reliably compare them core for core or MHz for MHz; you need to stay within the same vendor for that kind of comparison. Almost all cards now support dual DVI (digital) monitor outputs, so you can have two monitors attached simultaneously. The other important variable is the screen resolution they support: you should expect a minimum of 1600×1200, and the higher you go, the more memory you will need on the card (as it acts as a frame buffer) and the more powerful the card will need to be to render the larger screen size quickly. Multiple graphics cards can now be installed using the ATI CrossFire or nVidia SLI inter-GPU communication standards and interfaces. It is also possible to have two GPUs on a single card, in which case SLI or CrossFire links the two GPUs on the one card. Be wary of this method of increasing your graphics performance: some applications and games are not designed to use it effectively, and it does not scale up linearly; each additional card (or GPU) is likely to give you only a 40-60% performance gain over a single card (or GPU).

  • Windows 32 or 64-bit (or the Operating System) – This is all about memory. Computers use the binary system of 1s and 0s to express numbers, and the number of bits they use determines how big a number the computer can handle. Each memory location in the computer is referenced by a sequential number, just like a street address for the postal service. With 32 bits the computer can address up to 4GB of memory; you can add more, but the computer simply won’t see it, and that is no longer enough. So the standard address size is now 64-bit, which means we can reference 2^64 memory locations, or 17.2 billion gigabytes (16 exabytes); see the quick calculation after this list. It should be a while before we need to change that again. That said, most of the latest motherboards accept up to six sticks of RAM at densities of up to 4GB per stick, so 24GB is the practical limit.
  • Optical Drives (DVD, CD and Blu-ray) – Optical drives come in three flavours: DVD Rewriter (DVD-RW), DVD/Blu-ray Reader (BD-R / DVD-R) and Blu-ray Rewriter (BD-RW). They really speak for themselves. Blu-ray is superior to DVD in two ways: 1) it uses blue laser light, which records at a much higher density than a red DVD laser, giving a greater capacity of 50GB versus a typical 8.5GB for DVD; and 2) you can play Blu-ray movies on a Blu-ray Reader or Rewriter. If you don’t care about either of those features, you don’t need Blu-ray and can save yourself some money. Generally only film aficionados or video editors make use of Blu-ray.
  • Interfaces (connections to your PC for peripherals and accessories) – All modern PCs should have interfaces supporting the following standards: Firewire (IEEE 1394), USB 2.0, eSATA and HD multichannel audio. Expect this as a minimum.
  • The Case – For your average PC it really doesn’t matter what you choose. However, if you think you are likely to want to upgrade regularly, then choose a standard size and construction rather than a branded case. Big-brand cases are often deliberately designed to be throwaway: they have no upgrade room, use non-standard sizes, or are very difficult to work on when removing outdated components. Also make sure it has good cooling (front, rear and top fans ideally) and is quiet. If you can afford it, choose an aluminium tool-less case rather than pressed steel; they generally look better and are far more easily upgradable. Also make sure that some of the PC interfaces have sockets on the chassis, i.e. USB 2.0, audio, Firewire (IEEE 1394) and eSATA.

There are a number of more advanced factors that affect performance, which I will cover very briefly as a detailed explanation is beyond this article. Please refer to my other articles for more information, where I go into all of these in some depth (some justify a whole article dedicated to them).
  • Front Side Bus, HyperTransport (AMD) or QuickPath Interconnect (FSB, or QPI with the Intel Core i7) – This is a communication channel (a ‘bus’) between memory and the processor. Over the last decade its speed steadily increased from tens of MHz until the last processor supporting the FSB (the Core 2) hit the technology’s ceiling at an official 1600MHz (though faster speeds were possible with overclocking). QPI speeds are currently 4.8 to 6.4 GT/s.
  • Overclocked – With the right competent and experienced vendor, overclocked machines are quite simply the fastest PCs you can buy and usually represent very good value.
  • SpeedStep (or EIST) – sounds sexy, but really isn’t. It’s a technology that cuts down power consumption by the processor when utilisation is low.
  • HyperThreading – Allows the processor to create the illusion to Windows that it has more cores than it really does, so a quad-core HyperThreaded processor looks as though it has eight cores. It’s not quite as good as it sounds, though: all the processor is doing is letting the extra virtual cores use parts of the processor that aren’t busy. If you are doing heavy work it is highly likely there aren’t any idle parts of the processor, and it won’t be very useful at all. In general use it gives about a 10-15% performance boost; under heavy processor loads it can actually get in the way and drop performance by 5% or so.
  • Turbo – It’s creeping back into use with Intel’s new Core i7 processors, and it works roughly the opposite way round to SpeedStep: if the processor is in high demand, the PC boosts performance by raising the processor’s clock speed by 200-300MHz. Its usefulness is rather overstated.
  • RAID – A disk controller technology that can both speed up disk transfers and offer resilience should a drive fail. It comes in different flavours: RAID 0 for performance, RAID 1 for resilience, and RAID 5 or RAID 10 for both resilience and speed (see the capacity sketch below).
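To put some numbers behind the lottery example in the CPU bullet above, here is a minimal Python sketch of the same divide-and-conquer idea, splitting the work across worker processes with the standard library’s multiprocessing module. The task size and measured times are illustrative; as the bullet warns, real speed-up depends on how well the work parallelises and on the overhead of splitting it up.

    import random
    import time
    from multiprocessing import Pool

    def draw(n):
        """Generate n lottery numbers (1-49) on one core."""
        return [random.randint(1, 49) for _ in range(n)]

    def timed_draw(total, cores):
        """Split `total` draws evenly across `cores` worker processes and time it."""
        chunk = total // cores
        start = time.perf_counter()
        with Pool(processes=cores) as pool:
            results = pool.map(draw, [chunk] * cores)
        elapsed = time.perf_counter() - start
        return elapsed, sum(len(r) for r in results)

    if __name__ == "__main__":
        total = 1_000_000
        for cores in (1, 2, 4):
            seconds, produced = timed_draw(total, cores)
            print(f"{cores} core(s): {produced:,} numbers in {seconds:.2f}s")

In practice the improvement from one to four processes is smaller than the ideal 8s, 4s, 2s progression described above, because starting processes and gathering results costs time; that overhead is exactly the ‘catch’ the bullet mentions.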
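The memory bullet above describes bandwidth in terms of lanes on a road; the sketch below turns that into the usual peak-bandwidth arithmetic: effective transfer rate (in millions of transfers per second) multiplied by the 8-byte width of a standard memory module, per channel. The module grades listed are the common DDR/DDR2/DDR3 speeds mentioned above, and treating the results as theoretical peaks (real-world throughput is lower) is my simplifying assumption.

    BUS_WIDTH_BYTES = 8   # a standard DIMM transfers 64 bits (8 bytes) per beat

    def peak_bandwidth_gb_s(transfers_mt_s, channels=1):
        """Theoretical peak bandwidth in GB/s for a given effective transfer rate."""
        return transfers_mt_s * 1_000_000 * BUS_WIDTH_BYTES * channels / 1e9

    for name, rate in [("DDR-400", 400), ("DDR2-800", 800),
                       ("DDR3-1333", 1333), ("DDR3-1600", 1600)]:
        single = peak_bandwidth_gb_s(rate)
        dual = peak_bandwidth_gb_s(rate, channels=2)
        print(f"{name}: {single:.1f} GB/s per channel, {dual:.1f} GB/s in dual channel")

This also illustrates the bullet’s point that a bigger number is not automatically better: DDR3-1333 has more raw bandwidth than DDR2-800, but its higher latency can cancel some of that advantage for latency-sensitive work.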
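The 32-bit versus 64-bit bullet above quotes the 4GB and 16-exabyte limits; the quick calculation below reproduces those figures directly from the number of address bits (2 to the power of the bit width, with each address referring to one byte), so nothing here is new data, just the arithmetic made explicit.

    def addressable_bytes(bits):
        """Bytes addressable with the given number of address bits."""
        return 2 ** bits

    print(f"32-bit: {addressable_bytes(32) / 2**30:.0f} GiB")        # the 4GB ceiling
    print(f"64-bit: {addressable_bytes(64) / 2**30:,.0f} GiB "       # about 17.2 billion GB
          f"(= {addressable_bytes(64) / 2**60:.0f} exbibytes)")      # i.e. 16 exabytes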
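Finally, as a companion to the RAID bullet, this sketch shows how usable capacity differs between the RAID levels mentioned (RAID 0, 1, 5 and 10), using the standard capacity formulas for those levels. The drive counts and the 1TB drive size are assumptions for illustration; resilience is the other half of the trade-off, with RAID 0 tolerating no drive failure and the other levels tolerating at least one.

    def usable_tb(level, drives, size_tb):
        """Usable capacity in TB for the RAID levels mentioned above."""
        if level == 0:                        # striping: all the space, no redundancy
            return drives * size_tb
        if level == 1:                        # mirroring: one copy's worth of space
            return size_tb
        if level == 5:                        # striping with parity: lose one drive
            return (drives - 1) * size_tb
        if level == 10:                       # mirrored stripes: half the space
            return drives * size_tb / 2
        raise ValueError(f"unsupported RAID level: {level}")

    examples = [(0, 2), (1, 2), (5, 4), (10, 4)]   # (RAID level, number of 1TB drives)
    for level, drives in examples:
        print(f"RAID {level:>2} with {drives} x 1TB drives: "
              f"{usable_tb(level, drives, 1):.0f} TB usable")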

Mystics and Telepathic Communications with Unseen and Unknown Person(s) via Advanced Technology

God speaks through all of us, leading us steadily albeit slowly toward her. How we interpret the data we receive from God is a product of the level of knowledge and understanding gained from our many lives. The level to which we hear God is also a matter of evolution. No matter what a mystic says or if the status quo “believes” them to be good or evil, if they are hearing, they need to be listened to.

A mystic is a primitive person, like we all are, who attempts to explain advanced technology to a primitive society. Mystics in antiquity, and even up to the present, made grand assumptions about who they were communicating with and, more importantly, what was being communicated.

Someone who has a true mystical experience is passionate in their desire to share it and to do something for mankind. This passion has led many a mystic to be ridiculed, rejected and imprisoned, and has often led to a torturous death.

The foundation of all religion is grounded in mysticism. As mysticism evolves, knowledge of the mystical evolves as well. As the technology of contemporary science evolves, so does the mystic’s ability to understand what is being said to them, who is communicating with or to them and, more importantly, why the communication is taking place.

It doesn’t matter how knowledgeable one believes oneself to be; we are a society that lives in the dark. We live in a world of beliefs! The bottom line of our situation is this: we live on a rock out in space and we have no clue how we got here. We have many theories, but the secret of our existence is still a mystery.

As we and our technology both evolve, the likelihood of discovering the roots of our creation becomes statistically more plausible. When one pledges faith to an ancient religious creed, one needs to consider those implications carefully. Pledging faith to a creed which has no basis in logic says something about ourselves, and we need to pay attention to what that something is.

It is the mystic who influences the path of beliefs, who presents something new and, in so doing, changes the path of faith for the status quo. Ironically, the mystic and the few followers who assist in the delivery of this new message are often rejected by society, only to be accepted after society has destroyed them. The status quo is never anxious to be enlightened, unless it is they who are doing the enlightening. However, once an impassioned mystic is disposed of or fades into myth and their philosophy has been altered to accommodate the wiles of the status quo, society is eager to accept them, but only within the boundaries specified by those who seek control. A good example is the manipulation of the philosophy of Christianity, which lived within the hearts of the faithful for more than three hundred years before Constantine the Great collected many of the ancient writings of the early Christian philosophers and evangelists, including that of Jesus, the Nazarene. The philosophy of Christianity goes a great deal further back in antiquity than two thousand years; it is basically the same philosophy as that of the Roman pagans, and of paganism in general, and dates back long before the time of Christ.

Our beliefs, whether we are aware of it or not, are always evolving. What happened two thousand years ago is not a matter of history but of logic. With the evolution of language, the motivation, politics, knowledge and understanding of the writer are all variables in what we believe. I believe we would all feel foolish if and when we were presented with the evidence of what really happened two thousand years ago. One day, because of the advanced technology that records everything we think, say and do… we will!

If one seeks to understand the mystic, thus the roots of all religions from the beginning of time, they must at least accept the premise, that advanced technology already exists and those that possess it want us to have it as well.

A mystic is a person who claims to explain and explore the mystical experience, which includes OBEs, UFO sightings, encounters with aliens, ghosts, angels, demons or long-dead relatives, communication with any number of unseen beings or deceased persons, or witnessing any variety of other-worldly events. Those who study the art of the mystical are few, and the skepticism of the majority thwarts the impassioned mystic’s endeavors to enlighten. To be fair to the majority, this process is complicated even further by fraud and overzealousness on the part of those who desperately desire attention, and complicated further still by the true mystic who, more often than not, is unable to explain their experiences in any reasonable format.

A great mystic once said that, “The road to paradise was narrow and few would find it!” Most major religions have similar variations of these great words. Ironically, the faithful (which includes most of the population of the world, those who have pledged their eternal futures to such a creed) follow the path of the majority, the safe, more secure path of least resistance. This is self-deceit, which manifests itself as social psychosis. The rejection of the mystic represents society’s desire to protect itself from seeing itself for what it truly is…. primitive, unknowledgeable and illogical!

The foundation of all beliefs is fear! When one “believes,” the path or search for truth ends. When the search for truth ends before it becomes knowledge, our evolution is slowed. When we say to God, “we believe,” what we are really telling God is that we have stopped our search…. this is illogical! If we haven’t answered the mystery of our existence, then the path of discovery needs to continue. When one claims victory when, in reality, there is no victory, the result is self-deceit and internal conflict, which leads to depression for the individual and psychosis for society as a whole.

What about the mystics who have led their true believers in the wrong direction? What are the common themes of the mystic? Don’t forget that all religions began as cults, and the members were run down and often imprisoned and even executed! All religions began as cults, cults that were rejected by the status quo!

Mystics attempt to interpret what they are hearing in the only way they can: through the use of rituals and symbols. These rituals and symbols are often the end result of something that has been misinterpreted. The mystic gets a piece of the puzzle and then tries to construct the entire puzzle from these misinterpretations. Some common themes are: end-of-times disasters; marriage with multiple partners; indiscriminate sex among members, often with the mystic but not with the other male members of the group; and celibacy and other forms of self-denial. There is separation of the mystic from the other members of the cult. There are strict rituals in some areas and liberal practices in others. Many mystics claim to be a Messiah or the second coming of Christ, and others seek to serve another Messiah, more than likely one who existed in antiquity.

One of the greatest self-deceits of all time is the belief that the greatest wisdom is found in the past, when logic would tell us that the greatest wisdom can only be found in the present. Society likes to keep wisdom in a place where the truth can never be found. Society fears the truth! As long as the truth is kept in the past, the status quo can make truth whatever they want it to be. However, if we allow logic to be our guide, we would have to make individual changes; we would have to swim against the current of the status quo and follow the path of greatest resistance, and the majority of people in the world will not accept this as truth. Society wants truth to be that which is the easiest to accomplish!

Many mystics claim to be either a Messiah or more specifically, the second coming of Christ. The reason the true mystic would say this is, because this is what they are being told by those who are speaking to them. The problem with communicating with beings capable of communicating telepathically is that they don’t always tell the truth and we as humans, tend to believe what we want to believe. Logic is the key to surviving this initiation of immortal life which beckons us to choose the path of good over evil because this is what it is!

The road to God, our Creator, is a road of discovery. As we discover more about God, the level of those who surround us changes. We are a society both here on the earth and out there, a world separated by knowledge.

Mystics are passionate! As you can imagine, anyone who is getting direct information from an unseen being or beings deals with this phenomenon in one of several ways. They question what they are hearing; they blindly do what they are being told; or they try to make the communication stop with drugs or alcohol, and sometimes a combination of all three. Often the curiosity of the mystic outweighs their fear of exploring the unknown. Many people explore mysticism in a number of ways, from Ouija boards to channeling or the many other forms of divination. As a practical matter, those who are on the other end of this communication attempt to scare away the mystical explorer. If they are doing their jobs correctly, they will put the fear of God into anyone who tries this with any commitment. There is a reason for this. They, out there, are seeking to “initiate” the true explorer. They want you to do this, but if they can scare you off, you’re not the one they need to do the job that is so badly needed out there. Everyone is called to explore the unknown and few actually do it. This says something about the state of our world today. Eventually everyone will do this! However, “when” we do this is a choice which must be made by the individual!

When we leave the earth, the search for God continues “out there, solamenta” and it follows the same path as does our path here on the earth. When we leave the earth, we find ourselves in a telepathic world and therefore, communicating with thought is the norm. However, you are told to be cautious when talking to unseen entities just as we are on the earth.

Being telepathic in a telepathic world is the same as communicating in a verbal world. You get used to it very easily and when we first start doing it, it becomes very normal, very quickly. However, it is one thing to converse with someone who is standing in front of us or someone who is on the phone and yet another, to converse telepathically with an unseen and unknown person. Even “out there” people seek higher wisdom.

Out there, some believe that the voice that speaks to them from the unseen world is a higher wisdom, just as many mystics here on the earth believe about their guiding voice. Even out there solamenta, conversing with an unseen voice is seen in many ways: some are warned against this form of communication, just as they are on the earth, and others encourage it. Out there solamenta, it is far more common than it is on the earth to see someone out doing their business while talking to an unseen entity.

Either here or out there, the fear of castigation leads some to cover up their association with whomever or whatever it is that is communicating with them. What is this voice that is responsible for developing both saint and serial killer? This voice that becomes all things and takes us in whatever direction we want to go, often to personal destruction and inner conflict? This voice is a very advanced computer, both God and Satan. It whispers in both our left ear and our right, and it can take us to sainthood or to destruction. It will allow you to see yourself for who you truly are inside, so that you can make the necessary changes in yourself: changes that inevitably lead us to eternal community… Paradise, or to another earthly life! However, we don’t join this elite eternal community until we have proven that we will not follow the path of least resistance: greed and self-indulgence! Life is an initiation into immortality, and into enlightenment if we choose well.

If you’re going to take a risk, take the risk in the direction of righteousness, the direction of logic! Think about what you are doing and the path you are following. Ignorance from lack of exploring is no excuse!

An idle mind is the devil’s workshop, this is true, but a mind which contemplates God may appear to be idle while doing exactly what God is telling it to do. It is best that we, as individuals within society, aren’t too quick to judge.

The mystic is a person who has spent a great deal of time looking inside and attempting to rationalize truth. Look at the mystic as a professional explorer of the inner realm, notwithstanding that the job doesn’t pay well most of the time. However, no one does this without the promise of treasure laid up in heaven, if not on the earth as well. How many of us are willing to make the same commitment? How many of us are willing to postpone the possibility of success, love and happiness on the earth for the chance of eternal success, love and happiness? How many of us are willing to make that gamble? If the answer to the question is a heartfelt yes, you have become a candidate for the next evolutionary step… to become an inner explorer… a mystic!

Everyone is called but few are chosen. Everyone is called, and everyone would be chosen, because out there, if your guy or gal on the earth becomes a mystic, it is a huge deal! Out there, even the bad mystics are more famous than popular entertainers are here on the earth. This is why mystics often refer to themselves as Messiah or Savior, the second coming of Jesus, etc. The Messiah complex occurs because the guides of the mystic are telling them this. This message has some truth to it, but it is also often seriously misunderstood by some mystics. The great mystic is a Messiah or Messenger, and to their guides they are a Savior! All mystics, unless they turn evil, attract a great deal of attention. Whether here on the earth or out there solamenta, a person who does something that very few do becomes such a novelty that the answers to our eternal quest for humanity seem to be present in whatever attracts so many people. The mystic gives their guides this great opportunity. Out there, they too turn to the mystic on the earth for answers. God favors the mystics on the earth, or at least so it is in the minds of those out there, solamenta.

Divine beings are those who have developed the greatest technology, using the mechanism of the universe (the creation of the “first being”) as their guide, and who work through those on the earth. This is how the chain reaction which results in universal community begins. It is through technology that those on the earth begin to understand how the universe was created, and that is what takes us from theory to knowledge.

We must first understand the creation if we are to understand the Creator and thus the Creator’s eternal plan. If earthlings are ever to develop these technologies, it must be done through those primitive, other-worldly explorers who venture into the realm of other-dimensional communication… the Mystic!