Welcome to the API World


    This article is about how to jump-start your Mule API development journey. It assumes that you already know the prerequisites and have fundamental Mule knowledge. If you're quite new to MuleSoft, API development, or integration in general, I would suggest going through the official Mule documentation, which will help you get a deeper understanding of the platform. If you're entirely new to MuleSoft, I hope this article gives you a conceptual overview of the platform.

    The question I initially had when I was starting Mule development was "Is there a specific, standard, and consistent way of developing Mule APIs?" The answer is both YES and NO.

    YES, because a consistent approach establishes best practices, enables rapid development, and lets you get past the "boilerplate code" and move on to the more important implementation of the business logic and requirements. Consistency also means readability, in the sense that other developers who look at the code will be able to understand and navigate it seamlessly.

    NO, because every project, company, or organization is different. Each has its own unique requirements, infrastructure, settings, and standards, which are customized and tailored for its own use cases. There is no "prescribed" way of doing things; hence the expression "there is more than one way to skin a cat." [ but why would you do that to an innocent cat? ]

    Moving on, the first thing I usually do when starting a new Mule API project is write the API specification. Good software development practice starts with a good design. The first step is always creating the RAML. RAML stands for RESTful API Modeling Language. This also assumes you have basic knowledge of RAML. For more information, you can read through https://raml.org/

Always Start with the API Design and Specification

    RAML is both machine and human readable, which means you can understand it just by reading it as a "human". You don't need super-human skills to read or create it. Mule supports RAML by default for the API specification, though there are other API specification formats as well, such as the OpenAPI Specification (formerly known as Swagger). In my opinion, the RAML can be a non-technical part of the API development, which means that a designer or business analyst should be able to create it. Mule also provides a tool called Anypoint Design Center to create and publish the RAML to Anypoint Exchange, create a mock service endpoint for end-users to test it, and gather feedback for changes or improvements. Basically, a RAML defines the resources and their supported methods or actions, request and response data (or media) types such as XML or JSON, required parameters, response HTTP statuses (200, 400, 500), and security for authentication (Basic, OAuth), among others. It is important to get a thorough understanding of RAML. In object-oriented programming terms, a RAML can be compared to a Java interface.
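
    To make this concrete, here is a minimal sketch of what a RAML specification could look like for a hypothetical Customer API (the resource, fields, and statuses below are illustrative placeholders, not taken from any real project):

        #%RAML 1.0
        title: Customer API
        version: v1
        mediaType: application/json
        securitySchemes:
          basic:
            type: Basic Authentication

        /customers:
          get:
            description: Retrieve the list of customers
            securedBy: [ basic ]
            responses:
              200:
                body:
                  application/json:
                    example: |
                      [ { "id": 1, "name": "Jane Doe" } ]
              500:
                description: Unexpected server error
          post:
            description: Create a new customer
            body:
              application/json:
            responses:
              201:
              400:
                description: Invalid request body

    Even this small example already captures the resource, the supported methods, the media type, the expected statuses, and the security scheme described above.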

Create your Mule project in Anypoint Studio

    Anypoint Studio is an Eclipse-based integrated development environment for creating Mule API projects. In the latest versions of Anypoint Studio (7.5 and above), there is already an option to import your RAML directly from Anypoint Exchange. That means you don't need to keep the RAML files in sync between your source code control and the Design Center. I think this update is really great as it separates the developer and designer aspects of creating the RAML. That is why I mentioned earlier that creating the RAML can be part of the designer's or business analyst's responsibilities. When you import the RAML from Exchange, it becomes a Maven dependency added to your POM file.
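
    As a rough sketch, the imported specification shows up in the POM as a dependency along these lines (the group ID is your Anypoint organization ID, and the artifact name and version here are placeholders):

        <dependency>
            <groupId>${anypoint.org.id}</groupId>
            <artifactId>customer-api</artifactId>
            <version>1.0.0</version>
            <classifier>raml</classifier>
            <type>zip</type>
        </dependency>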

Generate your flows from RAML

    Flows are the basic processing units of a Mule application. Each flow has an event source (the event that triggers its execution), one or more processors (what it does), and optionally its own error handler. In Anypoint Studio you can automatically generate the flows from your RAML specification (right-click magic!).

    Each method of each resource gets its own flow. Mule knows how to route events to the corresponding flow for each resource/method combination using the APIKit Router. This is why the generated flows do not have an event source of their own: the main flow listens for requests, and the APIKit Router takes care of dispatching them to the right flows.
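
    Assuming the hypothetical Customer API above, the generated configuration might look roughly like this (listener path, config names, and flow contents are placeholders):

        <flow name="customer-api-main">
            <http:listener config-ref="HTTP_Listener_config" path="/api/*"/>
            <apikit:router config-ref="customer-api-config"/>
        </flow>

        <!-- Generated per resource/method; no event source of its own -->
        <flow name="get:\customers:customer-api-config">
            <logger level="INFO" message="Fetching customers"/>
            <!-- business logic goes here -->
        </flow>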

Now you're ready to implement your API integration flows!!

    As a developer you're now ready to implement the integration flows and business logic. MuleSoft provides hundreds of connectors to external systems such as databases and other enterprise systems, as well as transformers and other functionality. These are added as modules, which are basically libraries containing a variety of operations. Think of them as reusable packages that let you access those operations in a declarative way.
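
    For example, reading from a database is as declarative as the following sketch (the connection details and query are made up for illustration; in practice they would come from your configuration properties):

        <db:config name="Database_Config">
            <db:my-sql-connection host="${db.host}" port="${db.port}"
                                  user="${db.user}" password="${db.password}"
                                  database="sales"/>
        </db:config>

        <db:select config-ref="Database_Config">
            <db:sql>SELECT id, name FROM customers WHERE id = :id</db:sql>
            <db:input-parameters>#[{ id: vars.customerId }]</db:input-parameters>
        </db:select>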

Global Configurations and Environment Properties

    Each connector optionally has global configurations, which are reusable settings. For example, suppose you have a Web Service Consumer that calls an external SOAP web service on a particular host, port, and URI path. You can create a global configuration to define these settings "once" and reuse them across your code.
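
    A sketch of what that could look like, with a made-up WSDL location, service, and operation:

        <wsc:config name="Order_Service_Config">
            <wsc:connection wsdlLocation="${orders.wsdl.url}"
                            service="OrderService" port="OrderPort"
                            address="${orders.service.url}"/>
        </wsc:config>

        <!-- Reused anywhere in the application -->
        <wsc:consume config-ref="Order_Service_Config" operation="getOrderStatus"/>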

    How about if you have different settings per environment? Suppose you have Non-Production and Production settings. You can have environment-specific settings by using Configuration Properties. These settings are defined in a separate .properties or .yaml file. In Mule 4, I prefer the YAML format as it is more readable and has a hierarchical structure. You can then define a deploy-time setting that points to your deployment environment (SIT, UAT, or PROD). The deploy-time settings are passed as command-line arguments to the Mule runtime. Anypoint Platform provides an easy way to set deploy-time properties, either via the web interface or the Maven plugin, depending on your deployment strategy.
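
    As an illustration, assuming one file per environment such as config-dev.yaml and config-prod.yaml (all names and values here are placeholders):

        # config-dev.yaml
        http:
          port: "8081"
        orders:
          service:
            url: "https://dev.example.com/orders"
        db:
          host: "dev-db.internal"
          user: "app"

    The application would then load the right file with <configuration-properties file="config-${env}.yaml"/> and receive the environment as a deploy-time argument such as -Denv=dev.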

CloudHub Deployment Strategy

     CloudHub is the iPaaS (integration platform as a service) offering of Mule. It allows you to deploy and run your API applications in the cloud without setting up and maintaining your own servers in your data centers. It supports zero-downtime updates and is highly available and fault-tolerant (it detects unresponsive apps and restarts them automatically). No infra, no problem! You don't have to install software or configure and maintain servers, networks, and load balancers in order to deploy your APIs. It is a great option for enterprises looking for the fastest possible turnaround time. Of course, there are also other deployment strategies that cater to different sizes of business. Mule does not prescribe one; it leaves the decision to you.

Automate your CloudHub Deployment using the Maven Plugin

     Mule enables automated and continuous deployment to CloudHub using the Mule Maven plugin. It is a build plugin that you define in your POM file. POM is the Project Object Model for Mavenized projects, which basically lets you define your project configuration such as the project/artifact name, version, dependencies, build configuration, and other project settings. A Mule project is basically a Maven project. You can define the CloudHub deployment configuration in your POM file. The Maven plugin can also be used for other deployment strategies, which I will try to tackle in my next post.
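
    Here is a sketch of what the CloudHub section of the plugin configuration could look like (version numbers, application name, environment, and worker sizing are placeholders; credentials would normally come from Maven properties or a CI/CD secret store):

        <plugin>
            <groupId>org.mule.tools.maven</groupId>
            <artifactId>mule-maven-plugin</artifactId>
            <version>3.8.2</version>
            <extensions>true</extensions>
            <configuration>
                <cloudHubDeployment>
                    <uri>https://anypoint.mulesoft.com</uri>
                    <muleVersion>4.4.0</muleVersion>
                    <username>${anypoint.username}</username>
                    <password>${anypoint.password}</password>
                    <applicationName>customer-api-dev</applicationName>
                    <environment>Sandbox</environment>
                    <workers>1</workers>
                    <workerType>MICRO</workerType>
                    <properties>
                        <env>dev</env>
                    </properties>
                </cloudHubDeployment>
            </configuration>
        </plugin>

    A command such as mvn clean deploy -DmuleDeploy would then package the application and push it to CloudHub.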

Securing Your Properties

    Mule provides ways to secure your properties, especially if you're deploying to the cloud. You can achieve this by using Secure Configuration Properties, which keep encrypted values in your property files and decrypt them at runtime. You can also flag deploy-time settings as secure by listing them in the mule-artifact.json file, so that their values are stored encrypted and hidden in Anypoint Platform. It's one way to protect sensitive data such as your API keys, passwords, and other credentials.
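
    A sketch of the Secure Configuration Properties setup, with placeholder file names and key (the encryption key itself would be passed at deploy time, never committed):

        <secure-properties:config name="Secure_Props"
                                  file="secure-${env}.yaml"
                                  key="${encryption.key}">
            <secure-properties:encrypt algorithm="AES" mode="CBC"/>
        </secure-properties:config>

        <!-- Encrypted values are then referenced with the secure:: prefix -->
        <db:my-sql-connection host="${db.host}" user="${db.user}"
                              password="${secure::db.password}" database="sales"/>

    On the deployment side, listing property names such as encryption.key under the "secureProperties" array in mule-artifact.json tells Anypoint Platform to treat them as secure.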

Automate your Unit Tests using the MUnit Plugin

     Finally, the last part of this long post. Mule lets you automate your unit tests (also called white-box testing) using the MUnit plugin. Why write unit tests? Unit tests are essentially regression tests: they check your code and prevent breakages when new changes are introduced. For example, if you have already tested your code successfully and someone on your team merges new code that accidentally overwrites the existing behavior, MUnit tests act as a "pre-check" before the code is deployed. This is quite useful when implementing continuous deployment to Production. MUnit tests are also useful when you want to mock connections to an external database or system that is inaccessible from your local environment. You can mock the responses without an actual connection to the system in order to test your integration flows and other internal aspects of your code, such as your configurations, scripts, whether configuration properties are read successfully, and whether required configuration parameters are provided. I would highly recommend adding an MUnit test suite to your Mule projects, although tests aren't the most exciting things to write and sometimes take more time than you'd like. However, unit testing your code is always essential!
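
    A sketch of what such a test could look like, mocking the hypothetical database call from earlier (test and flow names are placeholders):

        <munit:test name="get-customers-returns-payload"
                    description="Mocks the database call and checks the flow output">
            <munit:behavior>
                <munit-tools:mock-when processor="db:select">
                    <munit-tools:then-return>
                        <munit-tools:payload value='#[[{ "id": 1, "name": "Jane Doe" }]]'/>
                    </munit-tools:then-return>
                </munit-tools:mock-when>
            </munit:behavior>
            <munit:execution>
                <flow-ref name="get:\customers:customer-api-config"/>
            </munit:execution>
            <munit:validation>
                <munit-tools:assert-that expression="#[sizeOf(payload)]"
                                         is="#[MunitTools::equalTo(1)]"/>
            </munit:validation>
        </munit:test>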

If you've just started your Mule API development journey, you can check out my GitHub page https://github.com/ralph-palomar/api-starter-pack.

You can clone and examine the example Mule 4 application project I have created to help you better understand the concepts I have discussed in this blog post. Feel free to add comments on this post; I would appreciate your ideas on topics I can share in this blog! :)
