Service Workers in Loklak Search

Loklak search is a web application built on the latest web technologies, and it aims to be a Progressive Web Application. A PWA is a web application which provides a rich, reliable, fast, and engaging web experience, and the web API which enables us to achieve this is the Service Worker. This blog post describes the basics of service workers and their usage in the Loklak Search application as a network proxy and a programmatic cache controller for static resources.

What are Service Workers?

In a formal definition, Matt Gaunt describes a service worker as a script that the browser runs in the background which helps us enable many modern web features. Most of these features involve intercepting network requests, caching, and responding from the cache in a programmatic way, independent of the native browser-based caching. Registering a service worker in an application is a really simple task; the one thing to keep in mind is that service workers need an HTTPS connection to work, as the web standard is built around the secure protocol.

To register a service worker:

    if ('serviceWorker' in navigator) {
      window.addEventListener('load', function() {
        navigator.serviceWorker.register('/sw.js').then(function(registration) {
          // Registration was successful
          console.log('ServiceWorker registration successful with scope: ', registration.scope);
        }, function(err) {
          // Registration failed :(
          console.log('ServiceWorker registration failed: ', err);
        });
      });
    }

This piece of JavaScript, if the browser supports service workers, registers the worker defined by sw.js. The service worker then goes through its lifecycle, gets installed, and takes control of the page it is registered with.

What do service workers solve in Loklak Search?

In Loklak search, the service worker currently acts as a network proxy and caching mechanism for static resources. These static resources include all the bundled JS files and images. The bundled chunks are cached in the service worker's cache and are served from the cache when requested. The chunking of assets has an advantage in this caching strategy: cache misses only happen for the chunks which were modified, while the unmodified parts of the application are served from the cache, so fewer assets need to be downloaded.

Service workers and Angular

As Loklak search is an Angular application, we have used the @angular/service-worker library to implement service workers. The library is simple to integrate and works with the CLI; there are two steps to enable it. The first is to install the service worker package:

    npm install --save @angular/service-worker

The second is to enable the service worker flag in .angular-cli.json:

    "apps": [
      {
        // Other configurations
        "serviceWorker": true
      }
    ]

Now, when we generate the production build from the CLI, along with all the application chunks we also get three files related to the service worker:

sw-register.bundle.js: This is a simple register script which is included in the index page to register the service worker.
worker-basic.js: This is…
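To make the intercept-and-respond-from-cache idea concrete, below is a minimal sketch of a generic cache-first fetch handler. This is not the worker generated by @angular/service-worker, and the cache name loklak-static-v1 is only an assumed placeholder for illustration.

    // Generic cache-first sketch (illustration only, not the
    // @angular/service-worker output). 'loklak-static-v1' is an assumed name.
    const CACHE_NAME = 'loklak-static-v1';

    self.addEventListener('fetch', (event: any) => {
      event.respondWith(
        caches.match(event.request).then((cached: Response | undefined) => {
          if (cached) {
            // Cache hit: respond without touching the network.
            return cached;
          }
          // Cache miss: fetch from the network and store a copy for next time.
          return fetch(event.request).then((response: Response) => {
            const copy = response.clone();
            caches.open(CACHE_NAME).then(cache => cache.put(event.request, copy));
            return response;
          });
        })
      );
    });

When using @angular/service-worker, the idea is that equivalent caching logic ships as the generated worker files rather than being written by hand.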

Continue Reading: Service Workers in Loklak Search

Route Based Chunking in Loklak Search

The Loklak search application running at loklak.org is growing in size as features are added, and this growth is roughly linear. Traditional SPAs tend to ship all the code required to run the application in one pass, as a single monolithic JavaScript file, along with the index.html. This approach is suitable for applications with a few pages which are all frequently used, where context switching between those logical pages happens at a high rate almost as soon as the application loads. In general, however, only a fraction of the code is what most users access most frequently. As the application grows, it does not make sense to ship the code for the entire application on the first request, because there are always some parts of the application, some views, which are rarely accessed. The loading of such parts of the application can be delayed until they are actually accessed. The Angular router provides an easy way to set up such a system, and it is used in the latest version of Loklak search. The technique used here is to load content according to the route. This makes sure that only the route which is viewed is loaded on the initial load, and subsequent loading is done at runtime as and when required.

Old setup for baseline

Here are the compiled file sizes of the project without chunking the application. As we can see, the file sizes are huge, especially the vendor bundle, which is about 5.4M, and the main bundle, which is about 0.5M. These are the files which are loaded on the first load, and due to their size, the first paint of the application suffers to a great extent. These numbers act as a baseline against which we will measure the impact of route based chunking.

Setup for route based chunking

The setup for route based chunking is fairly simple. This is our routing configuration: the modules which we want to lazy load are passed as the loadChildren attribute of the route. This attribute is a string which is the path of the feature module file, and the part after the hash symbol is the class name of the module in that file. This setup enables the router to load that module lazily when it is accessed by the user.

    const routes: Routes = [
      {
        path: '',
        pathMatch: 'full',
        loadChildren: './home/home.module#HomeModule',
        data: { preload: true }
      },
      {
        path: 'about',
        loadChildren: './about/about.module#AboutModule'
      },
      {
        path: 'contact',
        loadChildren: './contact/contact.module#ContactModule'
      },
      {
        path: 'search',
        loadChildren: './feed/feed.module#FeedModule',
        data: { preload: true }
      },
      {
        path: 'terms',
        loadChildren: './terms/terms.module#TermsModule'
      },
      {
        path: 'wall',
        loadChildren: './media-wall/media-wall.module#MediaWallModule'
      }
    ];

Preloading of some routes

As we can see, two of the route configurations above have a data attribute on which preload: true is specified. Sometimes we need to preload some part of the application which we know we will access soon enough. So angular…
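The excerpt cuts off before the preloading strategy itself, but as a rough illustration of how a preload flag like the one above is typically consumed, here is a minimal sketch of a custom Angular PreloadingStrategy. The class name AppPreloadingStrategy is an assumed name for this example and is not necessarily what Loklak search uses.

    import { Injectable } from '@angular/core';
    import { PreloadingStrategy, Route } from '@angular/router';
    import { Observable, of } from 'rxjs';

    // Sketch of a preload-on-demand strategy: only routes flagged with
    // data: { preload: true } are fetched eagerly after the initial load.
    // 'AppPreloadingStrategy' is an assumed name used for illustration.
    @Injectable()
    export class AppPreloadingStrategy implements PreloadingStrategy {
      preload(route: Route, load: () => Observable<any>): Observable<any> {
        // Calling load() triggers the download of the lazy chunk.
        return route.data && route.data['preload'] ? load() : of(null);
      }
    }

Such a strategy would then be registered through the router configuration, for example RouterModule.forRoot(routes, { preloadingStrategy: AppPreloadingStrategy }).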

Continue Reading: Route Based Chunking in Loklak Search

Lazy Loading Images in Loklak Search

In the last blog post, I discussed the basic Web APIs which help us create the lazy image loader component. I also discussed the structure which is used in the application to load images lazily. The core idea is to wrap the <img> element in a wrapper <app-lazy-img> element. This enables us to detect whether the element is in the viewport and to load the image only when it actually is. In this blog post, I will be discussing the implementation details of how this is achieved in Loklak search in an optimized manner. The logic for lazy loading of images in the application is divided into a component and a corresponding service. The reason for this splitting of logic will become clear as we discuss the core parts of the code for this feature.

Detecting the intersection with the viewport

The lazy image service is a service for the lazy image component which is registered by the modules which intend to use the app lazy image component. The task of this service is to register elements with the intersection observer and then emit an event when an element comes into the viewport, which the element can react to by using the other methods of the service to actually fetch the image.

    @Injectable()
    export class LazyImgService {
      private intersectionObserver: IntersectionObserver =
        new IntersectionObserver(this.observerCallback.bind(this), { rootMargin: '50% 50%' });

      private elementSubscriberMap: Map<Element, Subscriber<boolean>> =
        new Map<Element, Subscriber<boolean>>();
    }

The service has two member attributes: one is an IntersectionObserver, and the other is a Map which stores the references to the subscribers of this intersection observer. These references are later used to emit an event when the element comes into the viewport. The rootMargin of the intersection observer is set to 50%; this makes sure that the intersection event fires while the element is still 50% of the viewport away from being visible.

The observe public method of the service takes an element, passes it to the intersection observer to observe, and also puts the element in the subscriber map.

    public observe(element: Element): Observable<boolean> {
      const observable: Observable<boolean> = new Observable<boolean>(subscriber => {
        this.elementSubscriberMap.set(element, subscriber);
      });
      this.intersectionObserver.observe(element);
      return observable;
    }

Then there is the observer callback. This method receives as an argument all the entries intersecting the root of the observer. When the callback is fired, we find all the intersecting elements and emit the intersection event, indicating that the element is near the viewport and this is the time to load it.

    private observerCallback(entries: IntersectionObserverEntry[], observer: IntersectionObserver) {
      entries.forEach(entry => {
        if (this.elementSubscriberMap.has(entry.target)) {
          if (entry.intersectionRatio > 0) {
            const subscriber = this.elementSubscriberMap.get(entry.target);
            subscriber.next(true);
            this.elementSubscriberMap.delete(entry.target);
          }
        }
      });
    }

Now, our LazyImgComponent uses this service to register its element with the intersection observer and to react later, when the event is emitted. This method sets up the intersection observer to load the image: it subscribes to the event emitted by the service and eventually calls the loadImage method when the element intersects with the viewport.

    private setupIntersectionObserver() {
      this.lazyImgService
        .observe(this.elementRef.nativeElement)
        .subscribe(value =>…
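The excerpt stops before the loadImage method itself, so here is a hedged sketch of what such a method could look like using the Fetch API, which makes HTTP errors such as 404s observable so the component can show feedback instead of a broken image. The class and field names below (objectUrl, isLoaded, isError) are assumptions made for this example, not the actual LazyImgComponent code.

    // Illustrative sketch only; the actual LazyImgComponent may differ.
    class LazyImageLoaderSketch {
      objectUrl: string | null = null;
      isLoaded = false;
      isError = false;

      loadImage(src: string): void {
        fetch(src)
          .then(response => {
            if (!response.ok) {
              // A 404 or any other HTTP error: surface it so the template
              // can render a fallback instead of a broken image.
              throw new Error('Image request failed: ' + response.status);
            }
            return response.blob();
          })
          .then(blob => {
            // Hand the downloaded bytes to the wrapped <img> via an object URL.
            this.objectUrl = URL.createObjectURL(blob);
            this.isLoaded = true;
          })
          .catch(() => {
            this.isError = true;
          });
      }
    }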

Continue Reading: Lazy Loading Images in Loklak Search

Lazy loading images in Loklak Search

Loklak Search delivers media rich content to its users. Most of the media delivered to users is in the form of images. In the earlier versions of loklak search, these images were delivered to the users imperatively, irrespective of their need. What this meant is that whether or not an image was required by the user, it was delivered anyway, consuming bandwidth and slowing down the initial load of the app, as a large amount of data had to be fetched before the page was ready. The 404 errors were also not being handled, which gave the feel of a broken UI. So we required a mechanism to control this loading process and tap into its various aspects to handle the edge cases. This, on the whole, required a few new web standard APIs to enable a smooth working of this feature. These APIs are the IntersectionObserver API and the Fetch API.

As the details of this feature are involved and comprise new API standards, I have divided this into two posts: one with the basics of the above mentioned APIs, the outline approach to the module and its subcomponents, and the difficulties which we faced. The second post will mostly comprise the details of the code which went into making this feature and how we tackled the corner cases along the way.

Overview

Our goal here was to create a component which can lazily load images and provide UI feedback to the user in case of any load or parse error. As mentioned above, the development of this feature depends on relatively new web standard APIs, so it is important to understand how these APIs function before we see how they become an intrinsic part of our LazyImgComponent.

Intersection Observer

Historically, detecting the intersection of two elements on the web in a performant way has been nearly impossible, because it required constantly polling the DOM for the ClientRect of the element whose intersection we want to check. As these operations run on the main thread, this kind of polling has always been a source of bottlenecks in application performance. The Intersection Observer API is a web standard to detect when two DOM elements intersect with each other. It helps us configure a callback which fires whenever an element, called the target, intersects with another element (the root) or the viewport. Creating an intersection observer is a simple task; we just have to create a new instance of the observer.

    var observer = new IntersectionObserver(callback, options);

Here the callback is the function to run whenever an observed element intersects with the element supplied to the observer. That root element is configured in the options object passed to the Intersection Observer.

    var options = {
      root: document.querySelector('#root'), // Defaults to viewport if null
      rootMargin: '0px', // The margin around root within which the callback is triggered
      threshold: 1.0
    }

The target element whose intersection is to be tested with the main element can be set up using…
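To tie the two APIs together, here is a small standalone sketch that observes a set of images and, once one comes near the viewport, downloads it with the Fetch API so that HTTP errors can be handled explicitly. The img.lazy-img selector, the data-src attribute, and the load-error class are assumed conventions for this example only, not the Loklak search implementation.

    // End-to-end sketch of IntersectionObserver + Fetch (illustration only).
    const lazyImages = document.querySelectorAll<HTMLImageElement>('img.lazy-img');

    const observer = new IntersectionObserver((entries, obs) => {
      entries.forEach(entry => {
        if (entry.intersectionRatio > 0) {
          const img = entry.target as HTMLImageElement;
          const src = img.dataset['src'];
          if (src) {
            // Fetch the image ourselves so errors (e.g. a 404) can be
            // detected and handled instead of leaving a broken <img>.
            fetch(src)
              .then(response => {
                if (!response.ok) {
                  throw new Error('Failed with status ' + response.status);
                }
                return response.blob();
              })
              .then(blob => { img.src = URL.createObjectURL(blob); })
              .catch(() => img.classList.add('load-error'));
          }
          // Stop observing once the load has been kicked off.
          obs.unobserve(entry.target);
        }
      });
    }, { rootMargin: '0px', threshold: 0.1 });

    lazyImages.forEach(img => observer.observe(img));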

Continue Reading: Lazy loading images in Loklak Search