Monday, March 13, 2023

Revisiting Web 3.0 and Beyond

I'll start with a quick rundown of the history of the web. Web 1.0 was the first basic, visual website display. You had things like blogs and generic information sites. You could download porn images at five pixels a minute, and in general you went to sites to read an article or have a conversation on a forum. Then Web 2.0 gave us an interactive internet, better visuals, and actual application capability. As the web grew, so did the speed of internet providers to the home. In Web 2.0 you saw YouTube, eBay, and PayPal spring up, with the ability to entertain and provide commerce through the web.

I wrote an article a while ago, when Web 3.0 was starting to become a buzzword (https://highendlogic.blogspot.com/2016/03/web-30-and-beyond.html). In it I detailed my thoughts on what Web 3.0 would actually mean. My conclusion was that the next phase of the web would be making the internet do physical tasks. That seemed like the only direction left for the web to grow, because we already had the ability to make purchases and play games; I didn't see anything else I would need the web to do digitally.

If Web 2.0 disrupted the local shop, Web 3.0 ended up disrupting the need for a car. With Web 3.0 you have Uber and DoorDash making a car obsolete for any kind of in-city transportation and food delivery. This is not only a great product that adds convenience for those willing to pay the extra money; it actually created an entire new economy, the "gig economy." That seems to be the key when we define different eras of the web. There was a real impact on people's daily lives when they didn't have to ride the bus and could instead take an Uber from any location. There was also an impact on the many people who were able to make an extra buck delivering food without the friction of going through a hiring process, dealing with a manager, or having to be on the job at a particular time. It was very freeing for everyone to have these types of apps available. Once COVID hit, many people started to work from home, and Zoom became a major player in streaming video. This further lowered the need to own a car, and it reinforced that the web was no longer just about interacting with data remotely; instead, you become part of the app yourself. Web 3.0 is an effort to decentralize civilization.

Web 4.0 will allow remote access to tasks with more physical requirements. The remote-work trend will be taken a step further. I see a time when physical jobs like driving a truck or working a fast-food register will be done remotely, with the employee controlling mechanical equipment that performs the labor on site. The mechanics will be the obstacle to overcome, but just as the gig economy was a new benefit of Web 3.0, this remote-work automation brings its own. It could mean major savings for companies whose insurance policies are priced around having employees on site. Real estate can also be optimized, requiring less space for the same task. Remote access to automated systems could create a higher level of consistency as well. The military has already taken this into account with drone technology. I'll give a prediction of 2040; it really all depends on the physical machines being built, but I don't expect the software to be much of a problem.

Tuesday, October 22, 2019

Versioning and Abstraction Inefficiency

Certain recipes have been around for centuries, passed down within a family or, to a greater extent, carried on by an entire nation. If you are Italian you have a spaghetti recipe; if you are from the South you probably have a bar-b-q recipe. I have little hope of anything like this happening in software engineering. What we have today is a bunch of ingredients handed down in the form of the version of the language we use, but the combination of these foundational APIs into a consistent morsel of logic for our applications to consume is constantly changing. For an industry with a core focus on abstraction, we seem to abstract only as far as will last one to two years; then a new API must be created and the old way dubbed useless. Your grandma's spaghetti recipe is garbage, this hipster's recipe made with JSON sauce is the new hotness. Beyond popularity and the cloning of tools that already work perfectly well, there is a worse problem. Improper deprecation, bloated tool sets, and confusing documentation plague frameworks. This can make picking up new tools feel more cumbersome than catalyzing. A framework should sit at such a high level of abstraction that upgrading to it and learning it are trivial, but it often isn't.

Recently I adopted a number of Laravel framework projects, one of them a 4.3 version, one release old. I take a look at the docs; it seems like a standard PHP framework built on a bunch of Symfony libraries. I try to set it up on a Vagrant box and am bombarded with setup errors. First I get a Laravel error about a folder not being present: for some reason a "views" folder is not generated for the template caching system and has to be created manually. Second, I hit a problem with an environment variable that is not in the config folder but in a bootstrap file. That one would have been easy to find if I had known what I was looking for, but Google searches kept showing me the Laravel 5.0 env setup. I finally find the version 4 way of setting up the environment, and the question comes to mind: why did this need to change? It didn't; the change was made for better style, not a better feature. These kinds of changes should be at the top of the documentation. For the most part they are useless and just make picking up a framework more difficult than it needs to be.
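For anyone hitting the same wall, here is a sketch of the first fix, assuming the stock Laravel 4 layout (the `myapp` directory name is a placeholder for your project root):

```shell
# Laravel 4 compiles Blade templates into app/storage/views; if that folder
# is missing, the framework errors out instead of creating it. Recreate the
# directory and make the storage tree writable (myapp is a placeholder):
mkdir -p myapp/app/storage/views
chmod -R 775 myapp/app/storage

# Sanity check that the folder now exists:
ls myapp/app/storage
```

As for the environment variable: in Laravel 4 the environment is detected in `bootstrap/start.php` (via `$app->detectEnvironment(...)`), not in the `.env`-style setup that the Laravel 5.0 docs describe, which is why the newer search results were a dead end.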

Finally I get to building features and fixing bugs. I come upon a queue feature and start hitting a strange bug where a queue job will drop off and stop sometimes and not others. I use a command-line script provided by Laravel to restart the queue, and that seems to work, but I can't figure out why it stops. I eventually find that it's the length of time the script runs with certain data, and that this bypasses the error catching (I guess a timeout isn't an error in this case). I had to extend the timeout length for the SQS job in Amazon. After all these issues I had the feeling that this system is too complex: it wants to make things easy, but it bites off more than it can chew and in the end becomes a burden to consume. I see why there is an entire website dedicated to "Laracasts" (video documentation), but that shouldn't be necessary. If you need an entire domain dedicated to training, that's a sign your framework is not easy to use.
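The fix comes down to two timeout settings that have to agree; this is a configuration sketch rather than a drop-in script. The queue URL and the 300-second value are placeholders, and it assumes the `aws` CLI is configured with access to the queue:

```shell
# Placeholder queue URL; substitute your own.
QUEUE_URL="https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"

# Raise the SQS visibility timeout above your longest-running job, so the
# message is not handed back to the queue while the worker is still on it:
aws sqs set-queue-attributes --queue-url "$QUEue_URL" \
  --attributes VisibilityTimeout=300

# Give the Laravel worker a matching timeout (the listener's default is
# 60 seconds, which is where long jobs quietly die):
php artisan queue:listen sqs --timeout=300
```

The key point is that the framework-side timeout and the queue-side visibility timeout are configured in two different places, and a job that outlives either one fails without surfacing as an ordinary error.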

The difficulty of picking up a framework is one thing, but the bigger worry is completely breaking implementations between versions. If a framework API has a feature, why is it necessary to change not only the implementation but also the API? If the interface was bad before, it should have been changed within that version too. It's little things like this that add difficulty to using a framework while adding no useful functionality. At the very least, a script or a clear, simple process should be made available to make upgrading easy. Isn't this something we as software engineers should be good at? Isn't this a problem we shouldn't have? The first thing a framework provides is an abstraction; if that can't be done correctly, how can I have faith in the underlying implementation?

Another task I needed to do was get PHP 5.6 working. I had to add it to an install script on a Fedora VM because that's what we use on our production server. This led to a long set of steps, because dnf (Fedora's command-line package tool) would not let me download that specific version. I ended up having to research a third-party provider (thanks, Remi), add that repo to my dnf instance, and use it to download the correct version. This, along with my experience with frameworks (not just Laravel; Angular butchered its backwards compatibility so badly that it dropped the "JS" from its name), makes me uneasy about the future of our industry. When people order spaghetti they know what they are getting; we should be working toward the same with our software tools. I think this should actually be the most important part of open source software: not providing new feature libraries, but creating abstractions that stand the test of time.
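For the record, the dance looked roughly like this. Treat it as a sketch: the Fedora release number in the remi-release URL and the exact package names depend on your Fedora version, so they are placeholders here, and the whole thing needs root and network access:

```shell
# Add the Remi repository (the "30" is a placeholder Fedora release number;
# match it to the output of `cat /etc/fedora-release`):
sudo dnf install -y https://rpms.remirepo.net/fedora/remi-release-30.rpm

# Install the PHP 5.6 software-collection packages that Remi provides
# alongside the distribution's default PHP:
sudo dnf install -y php56 php56-php-cli

# Confirm which version actually landed:
php56 -v
```

None of this is hard once you know it, but nothing in stock Fedora points you at it, which is exactly the kind of friction the spaghetti-recipe standard would eliminate.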