IETech - The First 10 Years...
10 Years - Interactive Event Technologies Inc
Written by: Shawn Berney
Thanks to the Internet Archive's "Wayback Machine", I was able to pull these old articles that I had posted on my very old weblog. The links included within the blog all work - most point to the archive's snapshots of the website at the time. The PDF documents I have reposted for your access.
In Transition
This was a transition time for me - I moved to Toronto after completing my undergraduate degree at Thompson Rivers University in British Columbia. The academic interest in my undergraduate work inspired me to start a business. These are the blog postings and web pages from that period. At the end of each of these three blog postings, I've put some of my current thoughts on the topic.
I've also included some snippets of the old website for interest and color. Enjoy!
Taken From: https://web.archive.org/web/20140102132358/http://raguiding.net/
PHOTO: Shawn Berney exploring the Bruce Peninsula
Evolution of computer hardware - the serial vs parallel debate.
December 19th, 2009 · Comments Off
For many years, advances in the speed of computer hardware have come from performing tasks in a serial manner. Just think of the consumer space, where the Universal Serial Bus (USB) and Serial ATA (SATA) hard drives have emerged as the new standards for connectivity.
But the debate between serial and parallel communication methods has a long history. In the early days, computer processors limited the speed at which tasks could be performed. To solve this issue, single tasks were split into many parallel tasks and distributed across multiple processors. In recent years, the need to distribute tasks among many different processors has become largely unnecessary thanks to advances in processor speed. Twenty years ago personal computers were limited to 8 megahertz (MHz); today's processors handle 3,600 MHz easily.
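As a rough illustration of that splitting idea (my own sketch, not something from the original post), here is a minimal PHP example that divides one job across several worker processes. It assumes the command-line pcntl extension is available, and the chunk sizes and file paths are invented:

    <?php
    // Hypothetical sketch: split one job into chunks and hand each chunk to
    // its own worker process. Requires PHP's CLI pcntl extension.
    $chunks   = array(range(1, 250), range(251, 500), range(501, 750), range(751, 1000));
    $children = array();

    foreach ($chunks as $i => $chunk) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        } elseif ($pid === 0) {
            // Child process: do this chunk's share of the work, then exit.
            file_put_contents("/tmp/chunk_$i.txt", array_sum($chunk));
            exit(0);
        }
        $children[] = $pid; // Parent: remember the child and keep forking.
    }

    foreach ($children as $pid) {
        pcntl_waitpid($pid, $status); // Wait for every worker to finish.
    }

    $total = 0;
    foreach ($chunks as $i => $chunk) {
        $total += (int) file_get_contents("/tmp/chunk_$i.txt");
    }
    echo "Total: $total\n"; // Same answer as a serial loop, computed in parallel.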
In fact, parallel processing techniques are currently used in highly demanding gaming environments, where large amounts of visual data require processing in real time. Advancements in graphics cards (referred to as Graphics Processing Units, or GPUs) provide a glimpse into the future of computer design methods and the technology that will allow individuals to leverage the huge speed increases in data transfer.
Fiber optic technology is one technology that enables huge increases in data transfer rates. As society moves to greater personal connectivity over fiber, processing speed will be unable to keep pace with the transport of data. Once again we will be forced to consider parallel processing techniques.
In the next blog posting I will examine several initiatives that allow consumers to leverage the research and development in both serial and parallel technologies to create enriching user experiences.
Comments Off
Tags: The Daily Sys. Admin
COMMENTS - 2019-07-31 [Shawn Berney]: I remember being fascinated with GPUs' ability to process general math. On the first date with my wife, drinking wine at the Bier Markt in 2008, we talked about "XML cards" - wouldn't it be great if we could use PCI cards to process hierarchical data! Below is an image of my company website that describes how I was applying XML for metadata around image and web content. Now, 10 years later, my wife has been accepted into the Ryerson PhD program for Computer Science...
Taken from: https://web.archive.org/web/20131020103310/http://www.raguiding.net/BDDC.php
Web 2.0: Internet regulation and personal learning
November 16th, 2009 · Comments Off
It is interesting to view regulation as a form of control on the internet. Not only government legislation, but regulation by the use of computer code…
“My position on Net Neutrality, which is consistent with my position on everything else in the book is that it is a very bad idea to regulate ahead of the problem…[:15]… and I think regulating ahead of the problem, very dangerous - once again - when the technology is evolving as quickly as it is, and the regulatory process, and undoing mistakes made in the regulatory process, is as slow as it is.” - Larry Downes (author, Laws of Disruption) on the Hearsay Culture podcast, Show #101 - MP3 File [0:40:35 - 0:41:24]
Regulation, when not informed by government legislation, is often based upon an abstract definition of ‘best practice’ - meaning how Web 2.0 should be undertaken. This definition of best practice is adopted by developers and imposed when programmers encode it in software. In this way, computer code is the rule of law on the internet.
If we decide not to undertake legislative reform, how do we ensure that the internet evolves in a manner that reflects our collective social values? We are fortunate in Canada to have strong advocates for social and legal reform. Individuals such as Michael Geist and David Asper have supported individual rights in this constitutional debate (see attached).
But we must not rely on outside advocates, philanthropic or otherwise. Instead, we must work to understand the barriers technologies impose and make choices about internet usage that reflect our values and long-term interests. This discussion of the barriers internet applications impose on individuals has now taken hold in the academic world. Professors are seeking to understand how our environment affects our learning and are seeking software applications based on the criteria they find important.
In an effort to explore these ideas further, I am working with a highly experienced enterprise architect - Peter Rawsthorne - to create a chapter that discusses how the Internet is the Platform, and how educators can harness the internet to enhance personal learning.
Thoughts??
Email me :: job.4.shawn [at] raguiding.net
File::PDF, 484 KB - David Asper - Speech to University of Toronto - Donation of $7.5 M
Shawn Berney
Comments Off
Tags: Executive · The Daily Sys. Admin
COMMENTS - 2019-07-31 [Shawn Berney]: Today the technology world is attempting to understand how the proliferation of blockchain currencies such as Bitcoin should be regulated. I think this question around regulation is as relevant as it was 10 years ago - perhaps even more so...
Image: Website as viewed from the Lynx browser (originally captured March 2007)
Technical Developments…
October 24th, 2009 · Comments Off
So, in an effort to set up my online portfolio, I realized that my current server configuration needed to be upgraded…
The server currently hosting my online content is a Mac Xserve running the Mac OS X Tiger Server operating system. This operating system is now getting on in years and does not include the current software one might expect to run on a web server. One of the advantages of the Intel-based Macintosh computers, however, is the ability to upgrade components that run ‘under-the-hood’ in Unix. So this is what I did:
- Downloaded and installed the GD2 libraries used by PHP4.3 and PHP5 to manipulate images on the server
- Downloaded and installed the PHP5 source code
- Added 3 virtual domains to the server using server admin - each with a unique log file
- Installed and configured AWStats for web analytics
As the tutorials suggested, I first made a bootable image of my hard drive as a backup. This would allow me to simply plug in my backup hard drive and revert to the settings of the live server before the upgrades began. For this task, I used the well-respected BRU utility.
Following the instructions provided by topicdesk.com produced no major surprises during the installation, and in around two hours I had my system set up to run the GD graphics and XML libraries. I had also managed to upgrade my PHP version from 4.3 to the more robust and object-oriented PHP 5.
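For context, a quick sanity check of that kind of upgrade might look like the following PHP snippet (my own sketch - the exact checks I ran back then weren't recorded):

    <?php
    // Confirm the interpreter version and that the GD and XML extensions load.
    echo 'PHP version: ' . phpversion() . "\n";

    foreach (array('gd', 'xml', 'dom') as $ext) {
        echo $ext . ': ' . (extension_loaded($ext) ? 'loaded' : 'missing') . "\n";
    }

    if (function_exists('gd_info')) {
        $info = gd_info();
        echo 'GD version: ' . $info['GD Version'] . "\n"; // reports the bundled GD version string
    }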
The next task was to install the new virtual hosts. Apple makes this task very easy, and since I installed the PHP 5 library from the command line, the previous installation tasks did not completely break the ability to manage my server configuration from the Mac Server Admin tool. This meant that I could simply add Sites to my current ‘Web’ management profile. I decided to track visits to each site individually, so I also altered the names of the access_log files used by Apache and changed the format to ‘combinedv’ (used by AWStats to gather information about web visitors).
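To give a feel for what that looks like under the hood, here is a hypothetical httpd.conf excerpt (the domain names, paths and exact format string are invented - the real configuration from the topicdesk tutorial isn't preserved here): a combined-style log format with the virtual host name prepended, given a nickname so each site writes its own access log for AWStats.

    # Named log format: the standard combined format with the virtual host (%v) in front.
    LogFormat "%v %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combinedv

    NameVirtualHost *:80

    <VirtualHost *:80>
        ServerName   www.example-one.net
        DocumentRoot /Library/WebServer/Documents/example-one
        CustomLog    /var/log/httpd/example-one_access_log combinedv
    </VirtualHost>

    <VirtualHost *:80>
        ServerName   www.example-two.net
        DocumentRoot /Library/WebServer/Documents/example-two
        CustomLog    /var/log/httpd/example-two_access_log combinedv
    </VirtualHost>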
Next, I installed and configured the free AWStats web analytics software. This software can be used to collect information about web visitors and provide statistics on the most visited sections, the pages people most often left from, and so on. The program was installed to my localhost directory so that reports are available through the browser, but not to the general public.
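A hypothetical excerpt of the per-site AWStats configuration (file name and paths invented; the directives are the standard ones from AWStats' model configuration file):

    # awstats.example-one.conf
    LogFile="/var/log/httpd/example-one_access_log"
    LogType=W
    # Must mirror the Apache format; with a virtual-host prefix, a custom
    # format string is used instead of the plain combined preset (LogFormat=1).
    LogFormat="%virtualname %host %other %logname %time1 %methodurl %code %bytesd %refererquot %uaquot"
    SiteDomain="www.example-one.net"
    DirData="/var/lib/awstats"

Statistics are then refreshed from the command line with something like perl awstats.pl -config=example-one -update, and browsed through the AWStats CGI script on localhost.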
With the foundation now in place, I can begin to migrate my existing portfolio of web content to this new environment and track who is interested in the applications and content being hosted. Over the next week, I will be working to present my portfolio using simple HTML, CSS and JavaScript technologies on the IETech.Net domain. Stay tuned for updates!!
Comments Off
Tags: The Daily Sys. Admin
COMMENTS - 2019-07-31 [Shawn Berney]: It's hard to believe that my first website contract was almost 15 years ago. While travelling in the Patagonia region of Chile, I met a fellow who ran an adventure guiding company and was looking for a new website. Built using PHP, HTML and CSS, the site included fully interchangeable themes (for those who wanted the higher contrast of a black-on-white page for ease of reading). The website hosting this page itself was used as part of my IETech.Net brand - you can see the archive from 2011 here:
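As an aside on those interchangeable themes, here is a minimal sketch of how that kind of switcher can work in PHP (the file and parameter names are invented, not taken from the original site): remember the visitor's choice in a cookie and link the matching stylesheet.

    <?php
    // Map of available themes to their stylesheets.
    $themes = array('default' => 'default.css', 'high-contrast' => 'high-contrast.css');

    if (isset($_GET['theme']) && isset($themes[$_GET['theme']])) {
        // Visitor picked a theme: remember it for a year.
        setcookie('theme', $_GET['theme'], time() + 365 * 24 * 60 * 60);
        $theme = $_GET['theme'];
    } elseif (isset($_COOKIE['theme']) && isset($themes[$_COOKIE['theme']])) {
        $theme = $_COOKIE['theme']; // Returning visitor: reuse the saved choice.
    } else {
        $theme = 'default';
    }
    ?>
    <html>
    <head>
      <link rel="stylesheet" type="text/css" href="/css/<?php echo $themes[$theme]; ?>" />
    </head>
    <body>
      <p><a href="?theme=default">Default</a> | <a href="?theme=high-contrast">High contrast</a></p>
      <!-- page content -->
    </body>
    </html>

Because the cookie is set before any output, the same page can serve either theme on every later visit without the reader choosing again.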