Converting your Angular CLI application into an NPM Module: Part Two

In the second part of our NPM & Angular CLI series we discuss how we turned our internal application into an npm package and some of the challenges we experienced.

Intro

In my last post we reviewed the process of converting some pieces of your Angular CLI application into a package hostable on NPM.

In this post we will discuss a real world example in which I converted one of our own Angular CLI applications into an npm package (@erdiko/ngx-user-admin) and some of the lessons we learned while doing so.

I ran into challenges with my local development setup, component templates, and publishing the package to a namespaced npm organization.

Hopefully we can help you avoid these small pitfalls so you can focus on creating a fantastic package!

Local development & testing with npm link

To start moving code from the Angular CLI application into my new package, I had to set up my environment so the Angular CLI application would reference my local code. Honestly, finding a good way to do this became a giant pain, but I think we have found an easy method. I'll note the two methods I used and the side effects of each.

npm i [directory]

The first method I used was installing the local package with each build by providing the full path to my local directory.

Here’s an example of installing the package into your destination project (assuming local package location /home/andy/ngx-user-admin and your destination package is in /home/andy/user-admin):

cd /home/andy/user-admin/app/themes/user-admin
npm i /home/andy/ngx-user-admin

One downside to this method is that you must re-install using this command after each build of your local package. Every. Time.

Clearly this is not ideal and also a productivity time suck.

npm link

The second, and preferred method, is using the npm link command to create a reference to your local code. This method ‘tricks’ npm into installing and referencing your local code instead of the npm registry code.

This method also uses a ‘live’ reference, similar to a symlink, that makes sure your destination or parent package uses the most up-to-date code after each build.

Here’s an example installing the package into your destination project (assuming local package location /home/andy/ngx-user-admin and your destination package is in /home/andy/user-admin):

cd /home/andy/ngx-user-admin
npm link
cd /home/andy/user-admin/app/themes/user-admin
npm link @erdiko/ngx-user-admin

Please note that your linked packages will still be linked after deleting your local node_modules directories. This might cause confusion during later development and testing, so be sure to delete all link references when you are done developing with this command:

npm uninstall -g @erdiko/ngx-user-admin

Templates and Bundling

Another difficult challenge I ran into was including an HTML template as a separate file for the components.

When creating a component in an application you can easily reference your HTML templates by a simple path, and the compiler includes the files automatically. All of that goes out the window when you attempt to use the AOT compiler and RollupJS to create a UMD package: your components no longer know what the HTML file paths are relative to.

I have seen some documentation about using a module ID to reference your module path canonically, but this seems to have been deprecated. I eventually settled on reading the HTML into a variable. This method avoided the pitfalls of file paths and let me easily control the HTML I imported.

To achieve this I moved all of the component templates into files with a “.tpl.ts” extension to make sure they were compiled, and I moved all the HTML into an exported variable that returns the markup.

Let’s take a closer look at the Home component template file ‘home/home.component.tpl.ts‘:

export const tpl: string = `
<div class="row">
<div class="col-xs-12">
<h1 id="welcome-title">Erdiko User Admin</h1>
</div>
</div>
...

Here is where we actually import and use this template as a variable in `home/home.component.ts`:

import { tpl } from './home.component.tpl';
...
@Component({
selector: 'app-home',
template: tpl
})
export class HomeComponent implements OnInit {
...

I also used the rollup-plugin-angular plugin to validate and minify the inline HTML I import. This allows us to check the HTML before each build as well as remove some whitespace to make the package a little smaller.

From my rollup.config.js file:

plugins: [
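// angular() is provided by rollup-plugin-angular; minifyHtml and htmlminOpts
// are assumed to be imported/defined earlier in rollup.config.js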
angular({
preprocessors: {
template: template => minifyHtml(template, htmlminOpts)
}
}),

While originally VERY frustrating, I think this solution could work in a variety of packages even if you do not choose to bundle your packages up with rollup.

Publishing your package with an organizational namespace

npm’s registry is a great, free tool that allows you to publish open source packages, and it also offers ways to host packages under an organization along with options to prevent public access.

After some research we found it is possible to publish your package under your organization without a paid account, despite the lack of documentation on how to do this.

When publishing your application, use the “--access public” flag.

Here is how we initially published this package:

npm publish --access public

Note: you will only need this flag on your initial publish. Subsequent updates and publishes will not require this flag.

Please note that you must add the organizational namespace to your package.json file before you attempt to publish like this:

{
"name": "@erdiko/ngx-user-admin",
"version": "x.x.x",
...
}

Documentation and Bug Tracking

Once you publish your package, it’s your responsibility to provide good documentation for other developers and end users.

We plan on creating some end user documentation (coming soon!) using mkdocs since we like its themes and its ease of use. We also plan on including some custom user documentation, as we find it quite easy to generate attractive HTML from markdown files.

We will use a project called Compodoc to create some information about the code itself. This project combs through your TypeScript code and generates documentation from docblocks. It also creates a handy report noting your comment-to-code coverage ratio.

It’s also VERY important to provide some channels for your end users to ask questions and report bugs. Since this is an open source project hosted on GitHub, our general plan is to use the GitHub issue tracker to let users ask questions and report bugs. We believe this is the most straightforward way to interface with our users and to communicate with our development team.

Conclusion

Creating a package can present some significant non-coding challenges and I’m hoping we were able to save you some grief. We’re looking forward to a stable release of this package very soon, and we’re also excited about creating more packages in the future since we now have a major one under our belt.

If you have any questions about this process or about this package, feel free to reach out in the comments or on our github repo!

Converting your Angular CLI application into an NPM Module: Part One

Creating a custom application with Angular CLI is easy and fun. Converting your application into re-usable code is often the next logical step but can be quite a confusing process. We will discuss the process of converting your code into something you can actually use in multiple projects and distribute for others to use.

Intro

Angular CLI is a great tool to start an angular application easily and quickly. The latest version brings some great tools to the workflow like the module bundler WebPack.

Creating components and services quickly is a strong pro for the Angular CLI too, but at some point, you can only get so far with a single application. For example, you might find that you need to split some of the components into a discrete package for distribution.

Splitting your components and services into another package also has the additional benefit of allowing you to consume and extend your code.

NPM

For those unfamiliar, NPM (Node Package Manager) is a JavaScript package registry that allows for easy distribution. It’s a great way to find packages and a one-stop place for information on updates and support. There are thousands of packages currently published and you should be very excited to add yours!

A package is defined by its config file, package.json, which provides its metadata, the basic helper scripts, and the dependencies you need to run and compile your application. There is no doubt you have seen these files before. Angular CLI creates a basic package config when it creates your application, and we will detail some of the steps to edit this file for distribution.

How to decide what to move from your app to a package

While every piece of code you write is likely a brilliant gem that should be printed on resume stock and handed out as an example of Great Code, likely only some of it is custom code that deserves to be distributed to the masses.

Here are some questions you should ask yourself before deciding if a component or service is worthy of its own module:

  • Could this be a simple building block to solving a larger issue?
  • Do you see some of your angular code as being part of a solution to a larger issue?
  • Is this component something you or someone else will re-use on another project?
  • Does this solve a common problem you encounter?
  • Have you written code like this before and find yourself writing it repeatedly?
  • Are these collections of components unique enough that they are almost their own project?
  • Did you find the point of this application revolved around a few choice components?

Your module’s directory structure

The easiest way to explain how to create a new npm package is to show you a ‘complete’ package. Let’s look at an example below:

 ├── dist
 │   ├── bundles
 │   │   ├── ngx-foo-bar.umd.js
 │   │   └── ngx-foo-bar.umd.min.js
 │   ├── index.d.ts
 │   ├── index.js
 │   ├── index.js.map
 │   ├── index.metadata.json
 │   ├── node_modules
 │   ├── package.json
 │   └── src
 │       ├── foo-bar.module.d.ts
 │       ├── foo-bar.module.js
 │       ├── foo-bar.module.js.map
 │       ├── foo-bar.module.metadata.json
 │       ├── user.model.d.ts
 │       ├── user.model.js
 │       ├── user.model.js.map
 │       ├── user.model.metadata.json
 │       ├── user.service.d.ts
 │       ├── user.service.js
 │       ├── user.service.js.map
 │       └── user.service.metadata.json
 ├── index.html
 ├── index.ngsummary.json
 ├── index.ts
 ├── karma-test-shim.js
 ├── karma.conf.js
 ├── node_modules
 │   ├── @angular
 │   ├── @types
 │   ├── abbrev
 │   ├── ansi-align
 │   └── zone.js
 ├── package.json
 ├── rollup.config.js
 ├── src
 │   ├── foo-bar.module.ngfactory.ts
 │   ├── foo-bar.module.ngsummary.json
 │   ├── foo-bar.module.ts
 │   ├── user.model.ngsummary.json
 │   ├── user.model.ts
 │   ├── user.service.ngsummary.json
 │   └── user.service.ts
 ├── systemjs.config.js
 └── tsconfig.json

Two major directories to note in this example are: `src` and `dist`

The `src` or source directory is where you edit files and do your work. It also has its own `package.json` file that we will discuss more in depth later.

The `dist` or distribution directory is the compilation destination and is where we store the files we publish to npmjs. We go more in-depth on this later in the post.

Angular AOT Compiling

If you are reading this article you are most likely also aware of AOT (Ahead Of Time compilation).

For those of you who are familiar, this differs from “old school” angular compilation by doing a lot of the work that was normally done in the browser at the compilation time. Formerly this was all done by the JIT compiler where the browser would put together all the code. Now, this can all be handled by an offloaded backend process that “tree shakes” the code and trims off the fat.

Tree shaking results in a much smaller codebase, which translates to less data to send to the user, which means faster load times!

This, of course, is a very brief introduction to the concept of AOT. For more information please review the official docs.

Ignoring some files in your Repo

There are a lot of files created in the AOT build process that you are not used to seeing in your previous JS/TypeScript projects. Make sure you do not include these in your repo or your npm package by adding these patterns to your .gitignore file:

*.metadata.json
*.map.js
*.js

Also, create a new type of ignore file called .npmignore with these patterns:

*.ngfactory.ts
*.ts

Setting up your package config files for compilation and distribution

You should create two package.json files: one for development, and another for distribution. The development package file will allow you to edit and run your code for easy debugging. The production package file will allow you to distribute your smaller and compiled module to the masses.

Development package.json

Let’s take a look at an example development package.json file:

{
  "name": "my-package",
  "version": "1.0.0",
  "description": "An amazing module for Angular.",
  "scripts": {
    "transpile": "ngc -p tsconfig.json",
    "package": "rollup -c",
    "minify": "uglifyjs dist/bundles/my-package.umd.js --screw-ie8 --compress --mangle --comments --output dist/bundles/my-package.umd.min.js",
    "build": "npm run transpile && npm run package && npm run minify"
  },
  "types": "./index.d.ts",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/example/my-package.git"
  },
  "author": "Foo Bar",
  "license": "ISC",
  "bugs": {
    "url": "https://github.com/example/my-package/issues"
  },
  "homepage": "https://github.com/example/my-package#readme",
  "devDependencies": {
    "@angular/common": "~2.4.0",
    "@angular/compiler": "^2.4.10",
    ...
    "uglify-js": "^2.8.22",
    "webdriver-manager": "10.2.10",
    "zone.js": "^0.7.4"
  }
}

You will see the expected keys here; most provide meta information about the package itself.

There are some major differences in this file compared to the production package.json file: devDependencies & scripts.

NPM package scripts, as you know, are used to execute build and test commands, and we need to install some devDependencies so that you can work on your code.

Now let’s look at your production file…

Production package.json

You should create a production package.json file for your end users. This file should live in your `dist` directory since that is what you actually publish. Here is an example production package.json file:

{
  "name": "my-package",
  "version": "1.0.0",
  "description": "An amazing module for Angular.",
  "main": "bundles/my-package.umd.js",
  "module": "index.js",
  "typings": "index.d.ts",
  "keywords": [
    "angular 2",
  ],
  "repository": {
    "type": "git",
    "url": "git+https://github.com/example/my-package.git"
  },
  "author": "Foo Bar",
  "license": "ISC",
  "bugs": {
    "url": "https://github.com/example/my-package/issues"
  },
  "homepage": "https://github.com/example/my-package#readme",
  "peerDependencies": {
    "@angular/core": "^2.4.0 || ^4.0.0",
    "rxjs": "^5.0.1"
  }
}

Note this file is significantly smaller than the development version. You do not need to include the devDependencies, and you only point to your compiled module.

Demo-ing your Project

It’s a great idea to let end users see your package in action before they download and install it. I like to include a barebones application that implements and demonstrates my component.

This is a good way to promote your project and allows you to prove your code works.
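To make that concrete, here is a rough sketch of what such a demo application’s root module could look like. This is only an illustration: the FooBarModule name is borrowed from the example directory structure above, and the rest of the names are placeholders rather than part of a real package.

import { Component, NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';

// Import the published package the same way an end user would
import { FooBarModule } from 'my-package';

// A bare-bones component that exercises the package
@Component({
  selector: 'demo-app',
  template: `<h1>my-package demo</h1>`
})
export class DemoAppComponent {}

@NgModule({
  imports: [ BrowserModule, FooBarModule ],
  declarations: [ DemoAppComponent ],
  bootstrap: [ DemoAppComponent ]
})
export class DemoAppModule {}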

Conclusion

We have reviewed the process of taking your code from an Angular CLI project into a fully-fledged npmjs package for others to use. We hope you can use this advice and apply it to your own projects.

In my next post, we’ll explore how we ‘took apart’ one of our internal Angular CLI applications and turned large portions of it into an npm package.

Thanks for reading!

oAuth – Part 2

A practical example of integrating oAuth into your system for the authentication process.

Hello readers, today we will walk through a practical example of how to implement a simple login using the oAuth concepts described in the previous post. To accomplish this goal we are going to create a new Erdiko project with three additional packages that will help us interact with the Facebook API. Those packages are Erdiko-Authenticate, plus oauth2-client and oauth2-facebook from The League of Extraordinary Packages. But before we can start coding we need to create a developer account and register our app.

How to register your app

This is the easy step; let’s demonstrate how to register using Facebook.
The first step is to log in to the Facebook developers site with a valid Facebook user. The next step is to register yourself as a Developer: go to your account, click on the button, and follow the instructions. Easy, right?
The last step is to complete and submit the app registration form. That’s it! Now you have registered your app to use the Facebook API.

Prepare host project for the example

In this step we just need to create a new Erdiko project; for more details please refer to the Erdiko documentation.

composer create erdiko/erdiko oauth_authenticate_sample

The next step is to add the additional libraries.

Inside the project directory, add the new required packages:

composer require erdiko/authenticate
composer require league/oauth2-client
composer require league/oauth2-facebook

Setup erdiko-authenticate

Once it’s added we will need to add the config file app/config/default/authenticate.json. Here is an example:

{
  "authentication": {
    "available_types": [{
      "name": "mock",
      "namespace": "erdiko_authenticate_Services",
      "classname": "Mock",
      "enabled": true
    }]
  },
  "storage": {
    "selected": "session",
    "storage_types": [{
      "name": "session",
      "namespace": "erdiko_authenticate_Services",
      "classname": "SessionStorage",
      "enabled": true
    }]
  }
}

We will revisit this file in future sections.

Setup oAuth2 Client (from thephpleague)

This package doesn’t need any special configuration, but let’s take a look at the constructor parameters we will use in our extension.

$provider = new \League\OAuth2\Client\Provider\GenericProvider([
    'clientId'                => 'demoapp',    
    'clientSecret'            => 'demopass',   
    'redirectUri'             => 'http://example.com/yourredirecturl/',
    'urlAuthorize'            => 'http://example.com/oauth2/authorize',
    'urlAccessToken'          => 'http://example.com/oauth2/token',
    'urlResourceOwnerDetails' => 'http://example.com/oauth2/resource'
]);

clientId and clientSecret come from the dashboard of your registered app, where they are labeled App ID and App Secret.

redirectUri is a public URL on your server where the user will be redirected after the oAuth server (Facebook in our example) finishes the authentication process.

The last three keys (urlAuthorize, urlAccessToken and urlResourceOwnerDetails) will change based on which service we are using. In the above block, we assume that the oAuth server is hosted at example.com, perhaps our own custom implementation.

Create new Service class

We will implement AuthenticationInterface so that we can inject our class into the authenticate package through the authenticate.json config file we added previously.

In the authentication section you will have to add a new entry to the available_types array:

{
      "name": "oauth",
      "namespace": "app_services",
      "classname": "oAuthAuthenticator",
      "enabled": true
}

In the constructor method of this class we will create a provider instance for the client, but we will use the third package (oauth2-facebook) to simplify things:

namespace app\services;


use erdiko\authenticate\AuthenticationInterface;
use \League\OAuth2\Client\Provider\Facebook;

class oAuthAuthenticator  implements AuthenticationInterface
{
    protected $provider;

    public function __construct()
    {
        session_start();

        $this->provider = new Facebook([
            'clientId'          => '1925853064368417',
            'clientSecret'      => '55ea4c1a56ca6d1db485b954ZZZ',
            'redirectUri'       => 'https://example.com/login/setlogin',
            'graphApiVersion'   => 'v2.8',
        ]);
    }

    public function login($credentials)
    {
        if(!$this->checkLoggedUser()) {
           $authorizationUrl = $this->provider->getAuthorizationUrl();
           $_SESSION['oauth2state'] = $this->provider->getState();
           header('Location: ' . $authorizationUrl);
           return;
        }
    }

    protected function checkLoggedUser()
    {
       return (
            isset($_SESSION['code']) 
            && isset($_SESSION['oauth2state'])
       );
    }

    public function verify($credentials)
    {
       return true;
    }
}


As you can see in the login method, I’ve added a simplistic check for a logged-in user; if there’s no record in the session, we redirect to the login popup. We’re delegating the authentication process to Facebook, and the result of this process is ultimately handled by the getSetLogin method in the login controller.

Use new service in a Login Controller

In this example we are using a Login controller just to keep the authentication logic isolated. This is a normal Erdiko controller with three basic actions. The first, getLogin, loads a BasicAuthenticator and invokes the login method. This action is triggered when you click on one of those “Login with Facebook” buttons that you can find everywhere these days.

public function getLogin()
{
   $authenticator = new BasicAuthenticator(new User);
   
   $authenticator->login(array(),'oauth');
}

getSetLogin, a callback method, is responsible for handling the Facebook response and storing the user in the SESSION.

public function getSetLogin()
{
    $authenticator = new BasicAuthenticator(new User());
    if (empty($_GET['state']) || ($_GET['state'] !== $_SESSION['oauth2state'])) {

        unset($_SESSION['oauth2state']);
        throw new \Exception('Invalid state.');
    }
    $token = $this->provider->getAccessToken('authorization_code', [
        'code' => $_GET['code']
    ]);
    try{
        $user = $this->provider->getResourceOwner($token);
        $authenticator->persistUser($user);
    } catch (\Exception $e) {
        throw new \Exception($e->getMessage());
    }
}

Finally, we have the getLogout action, which destroys the SESSION so the user is no longer authenticated.

public function getLogout()
{
   $authenticator = new BasicAuthenticator(new User());
   $authenticator->logout();
   \erdiko\core\helpers\FlashMessages::set("Good bye, ".$authenticator->currentUser()->getUsername(), "success");
   $this->getLogin();
}

Conclusion

As you can see, it’s pretty easy to add oAuth features to your system. Delegating the authentication process to a trusted server and giving users the ability to use existing trusted accounts is a clear win.

Hope you enjoyed this small example! Thanks for reading and hope to see you in my next post!

Using Stubs and Mocks in Jasmine to test your Angular code

Setting up your environment is a very important part of unit testing AngularJS code. In this post we discuss some of the tools and methods we use to provide the full workflow of a system under test.

Intro

Unit tests allow us to automatically test our code under a variety of conditions to prove that it works and when it breaks it does so in expected ways. In order to predictably test code, we also need to completely control the setup and data provided to the code under test.

Thankfully, the test tools provided for Angular testing allow you to mimic your models and control how your code responds in a very precise way. Jasmine provides some easy ways to create test doubles and even “spy” on their execution.

In this post we will discuss some of the best practices and tools we use to control the environment your code is working against.

Why do we need to fake stuff?

We need to control the entire environment and workflow to make sure we are testing the exact scenarios we intend.

Messing around with the DB and creating fake records can be difficult at best; at worst it can create false positives. Even more commonly, some CI systems lack access to a DB at all.

We also need the ability to re-create the conditions under which a bug appeared; sometimes this means we need to replicate some crazy conditions.

There are two tools to make this happen: Test Stubs (also known as  Fakes, depending on who you ask) and Mock Objects:

  • Test stubs are a very simplified object or data structure designed specifically for the test scenario. These can be constructed at the beginning of the test or per test.
  • Mock objects are class instances that mimic an existing class to provide the same method interface and return specific values when a method call occurs.

Test Stubs in AngularJS & Jasmine

Test stubs are a simple data structure or model we rely on when running our tests. These can be as simple as a static array of data or a very lightweight object with publicly scoped methods. To differentiate a stub from a mock, we typically only mimic the methods we are actually testing.

More often than not, a stub is created at the beginning of the test suite and made accessible to the suite’s tests. A good practice is to limit modifications to the stubs to make sure you are always testing the same thing.

If you must make changes, it is a good idea to make a local copy to isolate your modifications to a specific test.
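For example, here is a minimal sketch of that pattern, assuming a hypothetical usersStub array defined at the top of the suite:

it('should handle a deactivated user', () => {
    // Deep-copy the suite-level stub so changes stay local to this test
    let localUsers = JSON.parse(JSON.stringify(usersStub));
    localUsers[0].active = false;

    component.users = localUsers;

    // ... assertions run against the modified copy only,
    // leaving usersStub untouched for the other tests
});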

Test Stub Examples

HeaderComponent

Below is an abridged version of the component code we are unit testing. The important thing to note: at the end of this method’s execution we make a call to the router service to navigate the user to the “login” route and we need to make sure this and only this method is executed.

/**
 * Simple component providing a navigation header.
 */
...
export class HeaderComponent implements OnInit {
    ...
    /**
     * clickLogout clears the user's creds but also 
     * takes the user to the Login route
     */
    clickLogout() {
        ...
        this.router.navigate(['/login']);
    }
    ...
}

Test Suite Code

Here is the abridged test suite where we create our test stub:

...

/** 
 * beforeEach setup executes before each test suite test runs.
 *
 * This allows us to setup the stub before each test
 */
beforeEach(async(() => {

    ...
    
    TestBed.configureTestingModule({
        ...
        providers: [

        /**
         * Create a very basic stub object with one method:'navigate'
         *
         * Use Jasmine's createSpy to create a very basic function
         * which also allows us to "listen in" when it's called
         */
        {
            provide: Router,
            useClass: class { 
                navigate = jasmine.createSpy("navigate"); 
            }
        }
        ...
    ]})
    .compileComponents();

    ...

}));

/**
 * Here we test the method and make sure we actually navigate
 */
it('should navigate to /login when clickLogout is fired', () => {
   ...
   let router = fixture.debugElement.injector.get(Router);
   component.clickLogout();

   // "listen" to make sure that the navigate method has been 
   // called and it was called with the expected value
   expect(router.navigate).toHaveBeenCalledWith(["/login"]);
   ...
});


Mocks

A mock object is a simulated object instance used to mimic a class’s behavior using the same interface. In simpler terms, this is a fake class with the same method signature as the Real Thing.

A mock object can be composed of multiple objects (and sometimes multiple mock objects), but most often should be created as simply as possible to keep your tests easily maintained.

Mock Object Example

/**
 * Create a mock of an existing service
 * by simply extending it and overriding some 
 * of the methods you wish to use in your tests
 */
class MockAuthService extends AuthService {

    /**
     * This method is implemented in the AuthService
     * we extend, but we overload it to make sure we
     * return a value we wish to test against
     */
    isLoggedIn() {
        return false;
    }
}

...

beforeEach(async(() => {

    ...

    TestBed.configureTestingModule({

        ...

        providers: [

        /**
         * Inject our mocked service in place of AuthService
         */
        {
            provide: AuthService,
            useClass: MockAuthService
        },

        ...

    ]})
    .compileComponents();

    ...

}));


it('should navigate to /login when clickLogout is fired', () => {

   /**
    * Get the mocked service here from our fixture
    * and add a spyOn over-ride to pretend we have
    * a logged in user.
    */
   let service = fixture.debugElement.injector.get(AuthService);
   spyOn(service, 'isLoggedIn').and.returnValue(true);
   
});

Conclusion

Unit tests are only as good as the environment you can provide for your code under test. It’s very important to be able to mimic and recreate the environments under which you get both positive and negative results for your code.

Unit Testing AngularJS Components & Services

Unit testing is a very important part of software development, and it is extremely important to Angular applications as they become increasingly more complex. We’ll explain how to set up your unit tests and describe what you can actually test with some examples.

Intro

Unit testing is a very important part of the software development process, especially as your applications become more and more complex. Making sure your code is robust and adaptable is an important aspect of being professional. This is no less true with your AngularJS application.

If you are reading this, you are most likely already sold on unit testing, but here are some benefits and reasons for writing and maintaining your unit tests:

  • The ability to quickly update your code and make sure you are able to regressively test your code.
  • Updating your unit tests to cover a fixed bug will prevent the bug from getting re-introduced!
  • Use a CI to make sure EVERYONE on your team is testing the code.

Angular CLI, Jasmine & KarmaJS

Here at Arroyo Labs we use the latest version of Angular CLI a lot. It’s a great way to start a project very quickly and provides a great foundation for your application. It also gives you some boilerplate unit tests for each component and service you create using the CLI.

While we have explained Jasmine and KarmaJS in past blog articles, here is a quick overview explaining how these frameworks work together:

  • Jasmine
    • A very popular behavior driven Javascript testing framework with a very clean syntax.
  • KarmaJS
    • A framework that allows you to run your Jasmine tests on a device or headless browser. The test runner itself will provide a DOM with which your code is rendered and provides the API that Jasmine will interact with.

Setting up your tests

To create a unit test, we need to provide a consistent environment in which to run the code under test, and we need to control the required classes and variables that go into creating this environment. Thanks to the magic of dependency injection, we can mock up the classes used by our classes and code.

Let’s take a look at the ‘top’ portion of a unit test we use in our user-admin project (you can find the whole unit test in our repo):

/**
 * Import some basic testing classes and utilities from AngularJS's
 * core testing code. This is the minimum required code to create 
 * a basic unit test.
 */
import {
     async,
     getTestBed,
     TestBed
} from '@angular/core/testing';

/**
 * Import some specific test classes used to mock the 
 * http backend and connection classes
 */
import {
    MockBackend,
    MockConnection
} from '@angular/http/testing';

/**
 * Import the HTTP classes we use in our service
 */
import {
    BaseRequestOptions,
    Http,
    Response,
    ResponseOptions,
    XHRBackend
} from '@angular/http';

/**
 * Finally, we import the required custom classes we use in our service,
 * and the service we are testing as well.
 */
import { User }         from '../shared/models/user.model';
import { AuthService }  from './auth.service';

import { UsersService } from './users.service';

What should we test in a service?

To test a service, we have to understand the underlying concept it represents: isolating logic and code into a re-usable class. This means we have a class we can pass around in our application where we get and manipulate data in expected ways.

Keep in mind, we can really only test publicly scoped variables and functions. The unit test itself is only able to interact with your service in the same way that an application is able to.

So, we have these things to test:

  • What public variables get initialized when we create the service?
  • Do public variables update as we expect when we call public functions?

Example Service Unit Test: UserService

In our UserAdmin project, we have a service called UserService used to interact with and manipulate user data we get from the AJAX backend.

The service itself has quite a few public methods (it’s a very important class!), so we will focus on just two of the suite’s tests of the same method, `getUsers`:

getUsers should return an empty observable list when the ajax request is unsuccessful

This is a test to make sure we return an empty list of users if the AJAX response does not contain an encoded list of users. This is a negative test, we are making sure that we return an expected response even when the backend returns an error.

it('#getUsers should return an empty observable list when the ajax   request is unsuccessful', () => {

 // set up the mocked service to return an
 // unsuccessful response (no users, 500 status)
 // with a helper method
 usersBodyData.success = false;
 setupConnections(backend, {
     body: {
         body: usersBodyData // use the previously init'd var for consistent responses
     },
     status: 500
 });

 // set up our subscriptions to test results 
 // when we actually get a result returned
 service.users$.subscribe((res) => {
     if(res) {
         expect(res).toBeTruthy();
         expect(res.length).toEqual(0);
     }
 });

 service.total$.subscribe((res) => {
     if(res) {
         expect(res).toBeTruthy();
         expect(res).toEqual(0);
     }
 });

 // make the actual request!
 let res = service.getUsers();

});

getUsers should return an observable list of users and result total when the ajax request is successful

This test makes sure that we get an observable list of users if the AJAX response is successful. To make sure this happens, we mock the response as though we have a JSON encoded list of users.

it('#getUsers should return an observable list of users and result total when the ajax request is successful', () => {

    // set up the mocked service to return a
    // successful response (users, 200 status)
    // with a helper method
    setupConnections(backend, {
        body: {
            body: usersBodyData
        },
        status: 200
    });

    // set up our subscriptions to test results
    // when we actually get a result returned
    service.users$.subscribe((res) => {
        if(res) {
            expect(res).toBeTruthy();
            expect(res.length).toEqual(2);
        }
    });

    service.total$.subscribe((res) => {
        if(res) {
            expect(res).toBeTruthy();
            expect(res).toEqual(2);
        }
    });

    // make the actual request, with some params to pass via query string
    // we will explore this in a future post
    let res = service.getUsers(42, 42, 'id', 'desc');
});

What should we test in a component?

A component, like a service, is also a re-usable piece of code, but it also has a defined lifecycle from its parent class and a frontend element via its template and optional CSS.

So, here is what we have to test:

  • Does this component actually get constructed?
  • What public variables get initialized and what DOM elements get rendered when we create the component?
  • Do public variables update as we expect when we call public functions?

As noted above when discussing testing a service, you can really only test things that have been scoped as public by the component itself. This does include rendered DOM elements, and you should test that these elements appear and render as expected since they are made public by the component itself.

Example Component Unit Test: UserListComponent

Test that the component actually results in a rendered component

it('should create', () => {
   fixture.detectChanges();
   const compiled = fixture.debugElement.nativeElement;
   component.ngOnInit();
 
   component.users = [];
   component.total = 0;
 
   // do we have a component at all?
   expect(component).toBeTruthy();
 
   // create new user button
   expect(compiled.querySelector('.btn-info')).toBeTruthy();
 
   // do we have a table
   expect(compiled.querySelector('table')).toBeTruthy();
   expect(compiled.querySelectorAll('tr').length).toBe(2);
});

Test that we display a list of users with a mocked response

it('should display list of users', () => {
   fixture.detectChanges();
   const compiled = fixture.debugElement.nativeElement;
 
   setupConnections(backend, {
       body: {
           body: bodyData
       },
       status: 200
   });
 
   component.ngOnInit();
 
   // create new user button
   expect(compiled.querySelector('.btn-info')).toBeTruthy();
 
   fixture.detectChanges();
 
   // do we have a table with users?
   fixture.detectChanges();
   expect(compiled.querySelectorAll('tr').length).toBe(21);
 
   // do we see the expected page count
   expect(component.getPageCount()).toBe(2); 
});

Conclusion

Unit testing is not an oft-celebrated portion of the software development cycle, but it is an important part of the process. Once you have good tests in place, you can pivot direction and update your code with an increased sense of security.

Angular 2 VideoJS Component

Using components is a fantastic way to include 3rd party libraries like VideoJS in your Angular 2 application.

We have talked previously about the basics and different types of Angular 2 components, but today we will give you a practical example of using Angular components to wrap an external library. We will focus on the excellent video media player library VideoJS, but this can really be applied to many other libraries as well.

Using external libraries in Angular 2

To actually use some 3rd party libraries within an Angular 2 application, you need to jump through some hoops. Most libraries require direct access to the DOM model and expect full control of the DOM rendering cycle. Part of the magic of Angular 2 is ceding some of that control to allow it to render when it feels like it.

To get around this hurdle you simply need to wrap your 3rd party libs in a way that gives more control to Angular and allows the 3rd party library access to the DOM via the application lifecycle.

Most 3rd party libraries, like VideoJS, would have been wrapped in a directive in previous versions of Angular. In our example we will use a component, but it is possible to use a pipe or a service depending on your 3rd party library.

VideoJS

VideoJS is a (rightfully) popular media player library that makes it easy to skin and control media files. This library can handle many types of media files like MP4s, MOVs and even YouTube videos (with a plugin).

Videos have traditionally been a tough element to render consistently across browsers. Given the rather recent support for the HTML5 video element things have gotten better, but it has always been a little tough to consistently style the player itself. VideoJS abstracts out both the skin and the API hooks, which makes things much easier than in the past.

One of the reasons we chose this library for our example is that it’s pretty self contained. It does not require external libraries like jQuery and is also relatively small.

VideoJS Component

Check out this plnkr to see the example VideoJS component.

First things first: include the VideoJS library assets in the application HTML head so we can use it with our component.

<link href="https://vjs.zencdn.net/5.11/video-js.min.css" rel="stylesheet">
<script src="https://vjs.zencdn.net/5.11/video.min.js"></script>

Our example component is pretty simple; let’s take a look at some of the component code itself.

The template code

  <video *ngIf="url" id="video_{{idx}}"
     class="video-js vjs-default-skin vjs-big-play-centered vjs-16-9"
     controls preload="auto"  width="640" height="264">
     
    <source [src]="url" type="video/mp4" />
   
  </video>

This is a pretty simple template, and really only consists of a video element with some CSS classes and a source element.

Some things to note in the template code itself:

Prevent rendering the element itself unless we have a video URL:

*ngIf="url"

Add a unique index to make sure we can reference this element alone:

id="video_{{idx}}"

Add a `source` element for the video url itself:

<source [src]="url" type="video/mp4" />

Initialize the Player in ngAfterViewInit

We need to wait until the component template is rendered before we can let VideoJS do its magic. In order to let Angular tell us when the component’s lifecycle has reached that point, we use a specific lifecycle hook called ngAfterViewInit that fires when the component template has been completed.

Check out the code below to see this in action:

  ngAfterViewInit() {
    
    // ID with which to access the template's video element
    let el = 'video_' + this.idx;
    
    // setup the player via the unique element ID
    this.player = videojs(document.getElementById(el), {}, function() {
      
      // Store the video object
      var myPlayer = this, id = myPlayer.id();
      
      // Make up an aspect ratio
      var aspectRatio = 264/640;
      
      // internal method to handle a window resize event to adjust the video player
      function resizeVideoJS(){
        var width = document.getElementById(id).parentElement.offsetWidth;
        myPlayer.width(width).height( width * aspectRatio );
      }
      
      // Initialize resizeVideoJS()
      resizeVideoJS();
      
      // Then on resize call resizeVideoJS()
      window.onresize = resizeVideoJS;
    });
  }

Component Element

Include the VideoJS component in your component:

<videojs [idx]="idx" [url]="video"></videojs>

If you check out our plnkr, you can see in app.ts that we support multiple video elements as well.

Conclusion

Components are a great way to help separate your coding concerns, and are a major building block in creating an Angular 2 application. They are also great to encapsulate and add some sanity to 3rd party libraries that you wish to use in your projects!
If you have any comments, concerns or questions, please let us know in the comments!

[Update 06/12/2017]

I have updated the plnkr code to show how to properly import the VideoJS library into your component. Thanks for the feedback!
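For reference, when VideoJS is installed as an npm dependency rather than loaded from a CDN script tag, the import at the top of the component looks roughly like this (a sketch; the exact form depends on your module setup and typings):

// Pull the library in from the npm package instead of relying on a global
import videojs from 'video.js';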

oAuth – Part 1

OAuth 2.0 is the industry-standard protocol for authorization.

Overview

Welcome, today I will go through the oAuth protocol, some definitions and how it works. In a later post I will show an example of how easy it is to use oAuth with the Erdiko Authenticate package and php league packages.

What is oAuth?

I bet most of you have the same question. We can simply say that it is a protocol for authenticating and authorizing third-party applications to access certain user data, without sharing the user’s password with the third-party app.
In other words, your application will have a token provided by a trusted service (say Facebook, Twitter or any other commercial provider) once a user has authenticated. That way, you delegate the authentication process to this third-party server. Additionally, with this token you can access some of the user’s information, previously granted by the user, e.g. profile information.
When using this method we can store the token so our application doesn’t need to remember any extra authentication credentials, the user doesn’t need to fill in profile fields every time they register on a website, and we can rely on the security of a trusted system.

OAuth 2.0

As per oAuth 2 site:
OAuth 2.0 is the industry-standard protocol for authorization. OAuth 2.0 supersedes the work done on the original OAuth protocol created in 2006.

The new version focuses on client developer simplicity, and provides specific flows for web, desktop and mobile development.

Components

Components, or oAuth roles, are the main pieces involved in the auth flow:
  • Client
  • Resource Server
  • Authorization Server
  • Resource Owner
Now let’s break them down:
Client (The Third-Party Application)

This is the application that is attempting to get access to the user’s account. It needs to get permission from the user before it can do so.

Resource Server (The API)

This is the server or service used to access the user’s information through an API.

The Authorization Server

This server presents the interface where the user approves or denies the request. In smaller implementations, this may be the same server as the API server, but larger scale deployments will often build this as a separate component.

Resource Owner (The User)

The resource owner is the person who is giving access to some portion of their account.

Protocol Flow

  1. The application requests authorization to access service resources from the user
  2. If the user authorizes the request, the application receives an authorization grant
  3. The application requests an access token from the authorization server (API) by presenting authentication of its own identity, and the authorization grant
  4. If the application identity is authenticated and the authorization grant is valid, the authorization server (API) issues an access token to the application. Authorization is complete.
  5. The application requests the resource from the resource server (API) and presents the access token for authentication
  6. If the access token is valid, the resource server (API) serves the resource to the application

You can implement this abstract flow in different ways based on the target architecture: a web application or a client application. The first could be any of the common web sites that allow us to log in using our Facebook or Twitter user, and the latter could be a mobile or desktop app.

In both cases, before you start using oAuth, you need to register the application with the Service Provider’s system; that way you will obtain a client_id and client_secret. The client_secret should be kept confidential and only used between the application and the Service Provider.

Once you have registered your app,  the first step is to get authorization from the user.

This is usually accomplished by displaying an interface provided by the service to the user.

The version 2 provides several “grant types” for different use cases. The grant types defined are:

  • Authorization Code for apps running on a web server, browser-based and mobile apps
  • Password for logging in with a username and password
  • Client credentials for application access
  • Implicit was previously recommended for clients without a secret, but has been superseded by using the Authorization Code grant with no secret.

The first step is to create a login link that redirects the user to the Auth Server, where they will be authenticated. This link might include the callback URL (this can vary depending on the server implementation; some systems require the callback URL in the app registration form). This is the URL the user will be redirected to after login.
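To make that step concrete, the login link is essentially the provider’s authorize endpoint with a handful of query string parameters. Here is a rough sketch; every URL and value below is a placeholder, not a specific provider’s API:

// Build the authorization (login) link described above
const authorizeUrl =
    'https://oauth-server.example.com/authorize' +
    '?response_type=code' +
    '&client_id=' + encodeURIComponent('YOUR_CLIENT_ID') +
    '&redirect_uri=' + encodeURIComponent('https://yourapp.example.com/login/setlogin') +
    '&scope=' + encodeURIComponent('profile') +
    '&state=' + encodeURIComponent('a-random-anti-csrf-token');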

In case of success, this redirect will include a Grant Code, which will be needed in the token exchange step.

After a successful login, the client app uses the Grant Code to request the access token, which will then be used as a credential on future requests.

Final thoughts

Hope this rough overview shed some light on the oAuth protocol concepts. In the next episode, I will bring you a practical example of how to implement a client module with the erdiko/authenticate package.

Testing a REST API with Frisby.js

When our web app depends on pulling data through a REST API, we need to test those endpoints well. That’s where things become more complicated and tedious to do manually. Two great libraries, Jasmine.js and Frisby.js, can help us out here.

What is Frisby and how does it work?

We are software developers and we strive to write good code. We try to ensure the quality of every line and workflow, which in turn means testing. Manual testing works to a point, but without automated testing of REST APIs and Ajax endpoints, regression testing becomes cumbersome. Are you really going to test all the possibilities/permutations with Postman? I didn’t think so.

Testing a REST API is not so different from testing modules, but to create “end-to-end” test cases we need to make real endpoint calls. A manual testing job quickly becomes a tedious task, so we need to find a way to automate things.

Frisby is a REST testing framework built on node.js and jasmine, and makes testing API endpoints a very flexible and easy task.

Like any normal npm package running in node, Frisby needs node.js installed first; after that you can install Frisby like any other package:

npm install -g frisby

By convention, the file containing the test suite should have a name ending in “_spec.js”; node.js runs JavaScript, so logically .js is the extension.

To continue our introduction we will study roleAPI_spec.js, a test suite that is part of the Users package of the Erdiko framework:

var frisby = require('../node_modules/frisby/lib/frisby');

This is the “kickoff” for our test suite; we load the library into a frisby variable, which will be our object throughout the suite.

describe('Role api test suite', function() {
  var baseURL = 'http://docker.local:8088/ajax/users/roles/';
  /**---This case test the api response and check the result----*/

  frisby.create('Get all roles')
        .get(baseURL + 'roles')
        .expectStatus(200)
        .expectHeader('Content-Type', 'application/json')
        .afterJSON(function (response) {          
            expect(response.body.method).toBe('roles');
            expect(response.body.success).toBe(true);          
            expect(response.errors).toBe(false);
            expect(response.body.roles).toBeDefined();
        })     
        .toss()
});

Every Frisby test starts with frisby.create(‘A gently worded description of the test’), and next we must specify the HTTP verb for the call to our endpoint. It is obvious, but it must match the endpoint’s structure.

In our case we used .get(baseURL + ‘roles’)

Differences between using post or get.

There are differences between the post and get verbs. When we create post test cases, we should pass objects as arguments. Let’s see an example grabbed from userAPI_spec.js, from the Users package of the Erdiko framework:

/*---------------------------------------------------------*/
/*-----------------Create User success --------------------*/
frisby.create('Creation will success.')
 .post(baseURL + 'register',
 {"name":"NameTest",
  "email":"test@email.com",
  "password":"secret"},
 { json: true },
 post_header)
 .expectStatus(200)
 .toss()

By default, Frisby sends POST and PUT requests as application/x-www-form-urlencoded parameters. If we want to send a raw request content, we must use { json: true } as the third argument to the .post().

Using Expectations.

/**--checking creation with expectations --*/
frisby.create('Checking user created exist.')
    .get(baseURL + 'retrieve?id=' + USER_ID)
    .expectStatus(200)
    .afterJSON(function (response) {
        expect(response.body.method).toBe('retrieve');
        expect(response.body.success).toBe(true);
        expect(response.body.user.id).toBe(response.body.user.id);
        expect(response.body.user.email).toBe(newUser.email);
        expect(response.body.user.name).toBe(newUser.name);
        expect(response.body.user.role.id).toBe(newUser.role);
    })
   .toss()

In comparison with traditional PHPUnit, the framework for creating unit test cases in PHP, we could say that PHPUnit asserts are analogous to Frisby expectations. In particular, we are interested in the expectations that handle JSON. For that we use the ‘expect’ clause, which compares the API response with a value provided by us:

expect(response.body.method).toBe('retrieve')

Here we expect ‘response.body.method’ to be equal to the string ‘retrieve’. It’s important to mention that the comparison is by type and value at the same time, and both should match for the result to be ‘true’.

Beyond that, here are a few other expectations we used:

  • expectJSON( [path], json )
  • expectJSONTypes( [path], json )
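Here is a rough sketch of how those two expectations read in practice (the endpoint and field names below are hypothetical, not taken from the Erdiko Users package):

frisby.create('User payload has the expected shape')
    .get(baseURL + 'retrieve?id=1')
    .expectStatus(200)
    // expectJSON compares the values found at the given path
    .expectJSON('body.user', {
        name: 'NameTest'
    })
    // expectJSONTypes checks the types of the fields instead of their values
    .expectJSONTypes('body.user', {
        id: Number,
        email: String
    })
    .toss();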

Managing Headers, afterJSON()

/**-----------delete user created ----------------*/
frisby.create('removing user created.')
    .get(baseURL + 'cancel?id=' + USER_ID)
    .expectStatus(200)
    .afterJSON(function (response) {
        expect(response.body.method).toBe('cancel');
        expect(response.body.success).toBe(true);
        expect(response.body.user.id).toBe(USER_ID);
    })
   .toss()

As we said previously, Frisby is an extension of jasmine-node, and afterJSON() is good evidence of that: it extends after() with the convenience of automatically parsing the response body as JSON. The ‘after’ callback runs immediately after the response is returned from the endpoint call.

Using inspectors, inspectJSON()

Inspector helpers are useful for viewing details about the HTTP response when the test does not pass, or has trouble for some reason. They are also useful for debugging the API itself as a more user-friendly alternative to curl.

// Console output
{ url: 'http://theurl/get?foo=bar',
    headers:
    { 'Content-Length': '',
        'X-Forwarded-Port': '80',
        Connection: 'keep-alive',
        Host: 'httphost.org',
        Cookie: '',
        'Content-Type': 'application/json' },
    args: { foo: 'bar'},
    origin: '127.0.0.1' }

Basically, inspectJSON() prints the raw HTTP response. Other similar general-purpose inspectors are:

  • inspectBody()
  • inspectHeaders()
  • inspectRequest()
  • inspectStatus()

Conclusion

Frisby makes it possible to write end-to-end tests with a lot of flexible tools, and testing a complete REST API becomes fun and easy. We do miss the ability to ‘serialize’ test cases as in a regular unit test framework, although given the nature of JavaScript it’s possible to chain nested test cases and mimic, in some respects, that behavior.

We haven’t tried Newman yet, but hope to soon.  Let us know about your Frisby (and Newman) experiences.

Introducing the Erdiko User Admin

Introducing the Erdiko User Admin package! A user administration package built with an Erdiko-powered backend and an Angular 2 frontend.

We’re excited to introduce the Erdiko User Admin! A modular package that provides an attractive UX to allow you to manage your users.

This is still very much a work in progress, but since we’re so excited, we want to let everyone know about this project (and request some help to keep us moving).

Check out our User Admin Project at this Github Repository and our Packagist entry.

Project Features

We wanted to create a modular and simple User Admin you can ‘bolt on’ to a new or existing project. We kept this as lean and mean as we could, while still providing some cool stuff to help get you started.

While this package is still very much in development, it does provide some of the following features:

  • A sortable and paginated list of user records.
  • An attractive user interface allowing the user to create, edit & delete user records
  • JWT user authentication!


Installation and Setup

The User Admin package is a part of the Erdiko module ecosystem, installable and upgradable via Composer. This makes it very simple to start a new project with one simple command.

composer create erdiko/user-admin [PROJECT NAME]

Please note that as of this time, we have not created an official release. You will need to include the minimum stability flag when you create your project:

composer create erdiko/user-admin [PROJECT NAME] --stability DEV

Package Dependencies

Here’s a brief list (and a bit of a plug) of some of the other Erdiko packages we as a team have been developing and that we used to build this project. While we use these packages for our development, we plan to keep them as modular and replaceable as we can.

Here’s the list:

  • erdiko/core
    • The base package that provides our basic routing and templating.
  • erdiko/authenticate
    • Authenticate the user’s credentials to assert they are who they say they are.
  • erdiko/authorize
    • A package that helps us enforce some user roles.
  • erdiko/users
    • Our Erdiko package that provides the backend storage and basic routes we use to interact with these models.

Angular 2

We utilize the great Angular CLI project to start and maintain the Angular code itself. This project was a great boon in getting us started on a well-structured and maintainable Angular application.

The internal structure is “just” a simple app, but we have updated the NPM scripts to compile and move the resulting files to a directory accessible to the Erdiko application’s Home route.

NPM Run Scripts

Here is a quick list and an explanation of some of the custom NPM run commands we have for this project.

  • Start the local Angular Development server: npm run start
  • Run the unit tests: npm run test
  • Run the e2e/functional tests: npm run e2e
  • Compile and export files for end user: npm run build

Next Steps

Here’s an incomplete list of some of the things we plan to work on, and complete, as the project moves forward:

  • Jasmine Unit Tests to cover our angular code
  • KarmaJS Functional Tests to cover the entire application end-to-end
  • A basic User facing profile designed for user extension

How to Contribute

If you have an idea for a new feature, a suggestion for improving an existing one, or (gasp) you have found a bug, please report it on our Github repo’s Issues page.

However if you are just excited about this project, feel free to “star” this repo on Github to show your support.

We would also love to know if and how you use this module in your own projects!

To set your environment up for local development, please follow these steps:

  1. Clone your fork of the User Admin project into a local directory
  2. Clone the following packages into the same directory
  3. Copy the composer-dev.json file to composer.json
  4. Start your docker container: docker-compose up --build

Conclusion

Thanks for reading about our exciting new package! Feel free to leave a comment or ask questions below, or reach out on our Github repo page!

Themes in the Erdiko Framework (Part 2)

In the first post we talked about how themes work in the Erdiko framework and covered the important files used to configure them. We continue with more advanced Erdiko theme topics and introduce new classes to explore more features, including layouts.

Today, I’m going to show you more concepts around themes in the Erdiko framework, along with the new classes you need to know and understand.

Layouts

To introduce the Layout class we first need to define what a layout is: it provides the HTML for the basic structure of a web page into which you place your own content. With that clarified, we can see that a layout may be made up of one, two, three, or any number of columns.

Let’s look at the next code from Erdiko framework:

/**
 * Get two column layout example
 */
public function getTwocolumn()
{
    // Set columns directly using a layout
    $columns = array(
        'one' => $this->getView('examples/one'),
        'two' => $this->getView('examples/nested_view')
        );
    
    $this->setTitle('2 Column Layout');
    $this->setContent($this->getLayout('2column', $columns));
}

This code is an action from the Example controller in the Erdiko framework code base. We can see how, by passing an array of views along with the ‘2column’ parameter, we get a new layout built for us. This is how we tell the Example controller which design we want for our application, and the final result will look something like this:

[Screenshot: the rendered 2-column layout]

It’s important to take care with this last parameter: a layout with that name must exist in the layouts folder, and the name must match exactly. Let’s recall our Erdiko folder structure:

[Screenshot: the theme’s layouts folder]

The key method is Controller->getLayout():

 $this->setContent($this->getLayout('2column', $columns));

Under the hood, the controller creates a new Layout object composed of the two views (one for each column) and delegates the HTML rendering to it:

return $layout->toHtml();
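
To make this delegation concrete, here is a minimal sketch of what such a getLayout() helper could look like inside a controller. This is an illustration under assumptions: the \erdiko\core\Layout constructor arguments and the getThemeName() helper shown below are guesses, and the real Erdiko implementation may differ.

// Hypothetical sketch of the delegation described above (not the actual
// Erdiko source). Assumes a Layout object built from a template name,
// the data (our array of column views), and the active theme.
public function getLayout($layoutName, $data = null)
{
    $layout = new \erdiko\core\Layout(
        'layouts/' . $layoutName,   // e.g. 'layouts/2column'
        $data,                      // the array of column views
        $this->getThemeName()       // hypothetical helper returning the theme
    );

    // The controller delegates the HTML rendering to the Layout object
    return $layout->toHtml();
}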

Pages

In order to reuse code, the framework gives us the ability to include HTML that is used often, e.g. the header and footer. This kind of content normally stays static across the application. When we say “static” we are talking about structure; we can still include Mustache code within these pages by using the {{ variable }} syntax.

Header

The header is a good example of HTML that will probably not change across our web app. The Erdiko framework installation provides us with nice default HTML in the file:
app/themes/bootstrap/templates/pages/header.html

Something interesting to note is the following piece of code:

<div class="navbar-collapse collapse" id="navbar-main">
  <ul class="nav navbar-nav">
    {{# menu.main }}
        <li>
            <a href="{{href}}">{{title}}</a>
        </li>
    {{/ menu.main }}
  </ul>
</div>

OK, what is that “menu.main” tag? Don’t worry; for now let’s just say that it allows us to create an HTML menu programmatically. We will give a fuller explanation when we talk about “Regions”. One important thing to highlight: the menu is built from the “menu” key in the application.json file.
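
As a rough illustration of the data that the {{# menu.main }} section iterates over, the “menu” key in application.json might hold entries like the following, shown here as the equivalent PHP array (the exact keys and structure are assumptions based on the template above, not taken from the Erdiko documentation):

// Hypothetical structure behind {{# menu.main }} and {{# menu.footer }}:
// each item supplies the {{href}} and {{title}} values used in the pages.
$menu = array(
    'main' => array(
        array('href' => '/', 'title' => 'Home'),
        array('href' => '/examples', 'title' => 'Examples'),
    ),
    'footer' => array(
        array('href' => '/about', 'title' => 'About'),
    ),
);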

Footer

The footer section follows more or less the same concept we discussed for the header, but applied to the bottom of the application. The Erdiko framework provides a default footer; the big difference from the header is the region used:

<div class="col-lg-12">
        <ul class="list-inline">
            <li class="pull-right"><a href="#top">Back to top</a></li>
            {{# menu.footer}}
                <li><a href="{{href}}">{{title}}</a></li>
            {{/ menu.footer}}
        </ul>
        <p>{{site.copyright}} {{site.full_name}}<br />
           Powered by <a href="https://github.com/ArroyoLabs/erdiko"
                         target="_blank">Erdiko</a>
        </p>
</div>

This time, the region used is ‘menu.footer’, and the key read from application.json is likewise “menu.footer”.

FlashMessages

To conclude with pages, another great and useful example is FlashMessages, which (redundantly enough) provides a flash message at the top of the page to denote something important that should be shown to the application user.

Unlike Header and Footer, FlashMessages is a helper, and you can call it:

<?php $messages = \erdiko\core\helpers\FlashMessages::get() ?>

‘$messages’ is an array, so you can loop over it to give it HTML structure:

<?php foreach($messages as $message): ?>
   <div class="alert alert-<?php echo $message['type'] ?> 
     alert-dismissible" role="alert">
     <button type="button" class="close" 
             data-dismiss="alert" aria-label="Close">
                <span aria-hidden="true">&times;</span>
    </button>
    <?php echo $message['text'] ?>
   </div>
<?php endforeach ?>

If you can get messages, it follows that you can set messages too 😉:

<?php
$message = 'This is a success message to show.';
// What color should the message be shown in? 'danger' is the default.
$type = 'success';
\erdiko\core\helpers\FlashMessages::set($message, $type);
?>
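
For example, a controller action could set a flash message right before rendering or redirecting, so that the ‘messages’ page picks it up on the next response (the action below is hypothetical and only illustrates where the call fits):

// Hypothetical controller action: persist something, then flag the result
public function postSave()
{
    // ... save the submitted data here ...

    // Queue a message; the 'messages' page rendered by default.php
    // will display it at the top of the next page
    \erdiko\core\helpers\FlashMessages::set('Your changes were saved.', 'success');
}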

Regions

In the first post, when we talked about Mustache, we saw how content is rendered in exactly the right position within the HTML. Regions take the same concept: a region is just a “mark” in the HTML that tells the framework what kind of HTML structure we want to create, right in the position where it should be placed.

Let’s take the region part of footer.html:

 {{# menu.footer}}
     <li><a href="{{href}}">{{title}}</a></li>
 {{/ menu.footer}}

When the framework processes this region tag, the result is a nice footer created programmatically. The million dollar question is:
– How does the framework know what the footer structure should be?
Easy: Erdiko reads the ‘menu.footer’ section of the application.json file and iterates over the key/value pairs within it.

Scripts and Styles

Previously, in our post “sample application using Erdiko Framework”, we built a little dice-rolling example with a very simple view. That view used the controller methods addJs() and addCss() to add JS and CSS code to the view, respectively. The method signature is as follows:

/**
 * Add Js includes to the page
 *
 * @param string $name
 * @param string $file
 * @param int $order
 * @param int $active
 */
public function addJs($name, $file, $order = 10, $active = 1)
{
    $this->getResponse()
        ->getTheme()
        ->addJs($name, $file, $order, $active);
}

Both methods share the same order and number of parameters.
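
For example, a controller action can attach a stylesheet and a script to the page like this (the asset names and file paths below are only illustrative):

// Register a theme stylesheet and a page-specific script with the response.
// $order presumably controls the include order and $active toggles the asset.
$this->addCss('theme', '/themes/bootstrap/css/theme.css', 10);
$this->addJs('dice', '/themes/bootstrap/js/dice.js', 20);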

Views

A view object is a piece of code that provides a visual representation of model data; in object terms it is a very simple class in which we can modularize our HTML code. Let’s take an example from the Erdiko framework:

$this->addView('examples/about', $data);

Here we add the view ‘examples/about’ to the controller response. Views can receive parameters; here is the constructor:

public function __construct($template = null, $data = null, $templateRootFolder = ERDIKO_APP)
{
    $this->initiate($template, $data, $templateRootFolder);
    $this->setTemplateFolder('views');
}

By reading this code we can easily understand what parameters a view needs; note how the folder ‘views’ is assigned by default.
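
For example, to pass data into a view from a controller action (the view path and data keys below are hypothetical):

// Build some data for the template, then append the view to the response.
// Inside the 'examples/about' template this data is available when rendering.
$data = array(
    'title' => 'About Us',
    'body'  => 'We build open source tools.',
);
$this->addView('examples/about', $data);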

There is not much more to say about views: the content added with addView() is rendered by the controller internally, which calls:

$this->appendContent($view->toHtml());

Using properties

There is a set of properties defined and accessible to the entire application. We can use them inside {{ variableName }} tags in views or templates, because they are just keys/values already defined in the application.json config file. To mention a few of them:
– site.name
– site.description
– site.copyright

Default.php

Default.php is, as its name implies, the default theme template applied to a given view. In general terms, default.php is a complete HTML page with the formal sections (head, body, etc.), but instead of using static code it calls the previously defined dynamic pages, such as the header, footer and messages. Let’s look at a fragment just to understand how the sections are referenced:

<?php echo $this->getTemplateHtml('header'); ?>
<?php echo $this->getTemplateHtml('messages'); ?>
<?php echo $this->getContent(); ?>
<?php echo $this->getTemplateHtml('footer'); ?>

where the method call:

$this->getContent()

is what renders the content of the view itself.

Conclusion

The Erdiko framework provides a very flexible way to change the look of a complete web application, however complex we make it; the key to achieving that is to properly set up all of our config files and templates.
Thanks for reading!