How to become an AWS Certified Developer

Becoming an AWS Certified Developer Associate will allow you to showcase your conceptual knowledge of AWS services to others, and give you an edge in today’s era of cloud computing. The weighted, multiple-choice exam lasts 130 minutes and contains 65 questions that test your understanding of the AWS platform from a developer’s perspective. (See the AWS “Exam Resources” page for more info.) You should be familiar with:

  • How to encrypt / decrypt secure data
  • Using IAM policies, identities vs ACLs
  • Why and how to utilize KMS
  • Web Identity and SAML Federation
  • User authentication & authorization
  • What a VPC is and how it can be used
  • Shared Responsibility Model
  • Important Service limits
  • Horizontal (auto) and vertical (compute) scaling
  • Deploying services through CI/CD systems
  • Elastic Beanstalk deployment strategies
  • How to achieve redundancy
  • Best practices in design and refactoring
  • Read/Write Dynamo Capacity Unit calculations
  • Service APIs
  • The following services and how they can be used together:
    • API Gateway
    • Elastic Beanstalk
    • CloudFormation
    • CloudWatch
    • Cognito
    • EC2
    • ELB
    • Elasticache
    • IAM
    • KMS
    • Lambda
    • Kinesis
    • DynamoDB
    • Code Commit / Build / Deploy / Pipeline
    • Step Functions
    • S3
    • SNS
    • SQS
    • SWF
    • STS
    • X-Ray
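The DynamoDB capacity-unit calculations mentioned above come up frequently on the exam. Here is a minimal sketch of the arithmetic (the helper names are my own, not AWS code): one RCU covers one strongly consistent read per second of an item up to 4 KB (or two eventually consistent reads), and one WCU covers one write per second of an item up to 1 KB.

```python
import math

def read_capacity_units(items_per_second, item_size_kb, strongly_consistent=True):
    """RCUs needed: item size rounds up to 4 KB chunks; eventual consistency halves the cost."""
    units_per_item = math.ceil(item_size_kb / 4)
    rcu = items_per_second * units_per_item
    return rcu if strongly_consistent else math.ceil(rcu / 2)

def write_capacity_units(items_per_second, item_size_kb):
    """WCUs needed: item size rounds up to 1 KB chunks."""
    return items_per_second * math.ceil(item_size_kb)

# 80 strongly consistent reads/sec of 6 KB items -> 80 * 2 = 160 RCU
print(read_capacity_units(80, 6))            # 160
# 10 writes/sec of 2.5 KB items -> 10 * 3 = 30 WCU
print(write_capacity_units(10, 2.5))         # 30
```

Being able to do this math quickly (round the item size up to the chunk size, multiply by throughput, halve for eventually consistent reads) is worth practicing before exam day.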

The exam is not easy, and rote memorization without experience and understanding of AWS services will guarantee failure (you need 740 / 1000 to pass). If there is a particular AWS service you have not used, it is highly recommended to dive in and experiment with it, while also considering how it can be used with other services. For example:

A) As a developer, you could use CloudFormation to provision a stack with a DynamoDB NoSQL database and an EC2 M5 instance as the server hosting your web service, instrumented with X-Ray and using Cognito to manage user identities. All with the proper IAM policies in place.
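A stack like example A might be sketched in a minimal CloudFormation template. The logical names, key schema, and AMI id below are illustrative placeholders, not a complete working stack:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  UsersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: userId
          AttributeType: S
      KeySchema:
        - AttributeName: userId
          KeyType: HASH
      ProvisionedThroughput:
        ReadCapacityUnits: 5
        WriteCapacityUnits: 5
  WebServer:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: m5.large
      ImageId: ami-12345678  # placeholder AMI id
```

The exam expects you to recognize template sections like Resources, Properties, and intrinsic functions, even if you normally generate templates with tooling.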

or

B) As a developer, you could deploy a Lambda package stored on S3, with API Gateway providing the primary endpoints/event triggers, and configure CloudWatch to send a message to SNS topics that will notify subscribers when a certain metric threshold has been reached. All with the proper IAM policies in place.

When preparing for the exam, it is recommended to take the practice exam from the AWS training site to familiarize yourself with the question format. This is in contrast to taking practice exams from third-party training sites, whose questions are limited to the course’s range and are often of insufficient difficulty. If you encounter any questions that you don’t know, or that seem too broad (there are often multiple valid answers but one “best” answer), take a step back and review that area to gain a better understanding.

After you pass the exam, you can share your recognition with the world by generating a badge, email signature, or transcript in the AWS Certification portal (CertMetrics). Additionally, you can buy branded swag, gain access to AWS Certification Lounges at events, receive discounts on future exams, and more.

Good luck on your journey to becoming an AWS Certified Developer Associate!

Testing email verification with Google Apps Script

Automated email verification can help streamline testing by circumventing the need for manual intervention. In cases where you have control of the email provider and recipient, it is possible to use an API to interface with this email account. This post will address how to perform this email verification with Google Apps Script.

Conceptual Overview

What we want to do is access a target Gmail account and perform CRUD operations on the emails. But how?

  1. Google Apps Script will provide us with a public customizable API proxy to perform those CRUD operations.
  2. Our client consumer will interact with the deployed Google Apps Script API proxy, executing the correct HTTP method as well as the request parameters necessary to operate on the Gmail account. A response should be returned so that the client can identify the result of the operation.

Setup

  • Create a new Gmail account that will be used with automation.

    Create Google Account

Design

  • Navigate to Google Apps Script and develop your application. For our purpose, we will simply need 1 post method that will take a JSON body in the parameters:
    • emailCount — How many of the most recent emails to check
    • subjectPattern — Regex pattern that the email subject should match against
    • dateAfter — Dates after this will be included as emails to check (ISO 8601)
    • timeout — How long in seconds should we wait to check the emails
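As a sketch of how the subjectPattern parameter is applied (the script below runs a regex search against each message subject), the equivalent check in Python terms, with made-up sample values:

```python
import re

# subjectPattern is a regular expression the email subject must match.
# These sample values are illustrative only.
subject_pattern = ".*Welcome.*"
subject = "Welcome Jane Doe!"

matches = re.search(subject_pattern, subject) is not None
print(matches)  # True
```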

The editor will provide you with auto-completion. See this page for the complete Apps Script reference. In addition, you can enable more APIs under Resources > Advanced Google Services.

Keep in mind, this is YOUR API proxy around the facilities that Gmail provides; you can do far more with it than what is shown here.

/**
 * Process unread emails and return latest match (stringified json)
 * according to subject Regex after marking it as unread
 * Waits n Seconds until a non-empty response is returned
 *
 * {
 * emailCount = Integer
 * subjectPattern = "String.*That_is_regex.*"
 * dateAfter = Date.toISOString()
 * timeout = Integer (seconds)
 * }
 */
function doPost(e) {
  var json = JSON.parse(e.postData.contents);
  
  var emailCount = json.emailCount;
  var subjectPattern = json.subjectPattern;
  var dateAfter = json.dateAfter;
  var timeoutMs = json.timeout * 1000;
  
  var start = Date.now();
  var waitTime = 0;
  var responseOutput = {};
  
  while(Object.getOwnPropertyNames(responseOutput).length == 0 && waitTime <= timeoutMs) {
    responseOutput = controller(emailCount, subjectPattern, dateAfter);
    waitTime = Date.now() - start;
    if (Object.getOwnPropertyNames(responseOutput).length == 0) {
      Utilities.sleep(1000); // back off before polling Gmail again
    }
  }
  
  return ContentService.createTextOutput(JSON.stringify(responseOutput)); 
}



function controller(emailCount, subjectPattern, dateAfter) {  
  var responseOutput = {};
  
  for(var i = 0; i < emailCount; i++) {
    // Get the msg in the first thread of your inbox
    var message = GmailApp.getInboxThreads(i, i + 1)[0].getMessages()[0];
    var msgSubject = message.getSubject();
    var msgDate = message.getDate();
    // Only check messages after specified Date & Subject match
    if(msgDate.toISOString() >= dateAfter) {
      if(msgSubject.search(subjectPattern) > -1) {
        if(message.isUnread()){
          GmailApp.markMessageRead(message);
          
          responseOutput = getEmailAsJson(message);
          break;
        }
      }
    }
  }
  
  return responseOutput;  
}



function getEmailAsJson(message) {
  var response = {};
  
  response["id"] = message.getId();
  response["date"] = message.getDate();
  response["from"] = message.getFrom();
  response["to"] = message.getTo();
  response["isRead"] = !message.isUnread();
  response["subject"] = message.getSubject();
  response["body"] = message.getBody();
  response["plainBody"] = message.getPlainBody();
  
  return response;
} 

When you are done, save your script.

Publish

  • Deploy your script as a web app to act as a Proxy. You will need:
    • Project version to deploy into (with commit comment)
    • Who this app will execute as — Basic security
    • Who has access to the app — More security

      Deploy Script as Web App

After continuing, your app should be deployed to a public Google Apps Script URL, which you will access as your API proxy. Copy the endpoint URL; you will use it next.

Run

  • Test it out! For this, I’ve accessed the web app endpoint with the following JSON body to find the latest email after 05/20/2019 7:14:19 UTC:
{
	"emailCount": 1,
	"subjectPattern": ".*",
	"dateAfter": "2019-05-20T07:14:19.194Z",
	"timeout": 5
}

As expected, the latest email was returned as a JSON response that also includes some metadata. The email was also marked as read, so subsequent requests will not reprocess it, all as specified in our script. Super!

Postman API post request to Google App Script
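The same check can be exercised outside Postman with any HTTP client. A minimal Python sketch of the request body the proxy expects (the helper name is my own, and the URL is a placeholder like the fake one used elsewhere in this post):

```python
import json

# Placeholder deployment URL, not a real endpoint.
SCRIPT_URL = "https://script.google.com/macros/s/FAKEGOOGLEAPPSSCRIPTURL/exec"

def build_check_email_payload(email_count, subject_pattern, date_after, timeout_seconds):
    """Body for the doPost handler; timeout is in seconds (the script converts to ms)."""
    return {
        "emailCount": email_count,
        "subjectPattern": subject_pattern,
        "dateAfter": date_after,
        "timeout": timeout_seconds,
    }

body = json.dumps(build_check_email_payload(1, ".*", "2019-05-20T07:14:19.194Z", 5))
print(body)
```

Note that Apps Script web apps answer POSTs with a 302 redirect, so whichever client you use must follow the redirect to read the response body (the Java client below does this explicitly).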

Integrate

  • In our client app, we may have something like the following, which exercises our Google Apps Script proxy:
package com.olandre.test.email;

import io.restassured.RestAssured;
import io.restassured.response.Response;
import org.json.simple.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class Email
{
    // SLF4J logger used by the warn/error calls below
    private static final Logger LOGGER = LoggerFactory.getLogger( Email.class );

    public static final String servicedGmailFullCapabilitiesEmail = "emailuser1@gmail.com";
    public static final String servicedGmailFullCapabilitiesService = "//script.google.com/macros/s/FAKEGOOGLEAPPSSCRIPTURL/exec";

    public Email()
    {
    }

    public static String getCurrentMethodName()
    {
        return Thread.currentThread().getStackTrace()[2].getClassName() + "." + Thread.currentThread().getStackTrace()[2].getMethodName();
    }

    public String processNewMemberSignupEmail(String email, Integer timeout, String emailSearchPastDate, String firstName,
        String lastName) throws Exception{
        final String NEW_SIGNUP_SUBJECT = String.format( "Welcome %s %s!", firstName, lastName);
        final String SIGNUP_CONTINUE_LINK_REGEX = ".*=\"(http.*/signup/.*)\" target.*";
        final String PLAINTEXT_SIGNUP_TEXT = String.format(
            ".*(Hi, %s! Welcome Aboard .*To sign up, you'll need to create a password.*html.*/register/).*", firstName);

        Request request = new Request();
        Response response = request.checkEmail( email, NEW_SIGNUP_SUBJECT, emailSearchPastDate, timeout );
        return request.findEmailClickthroughLink(
            response, SIGNUP_CONTINUE_LINK_REGEX, PLAINTEXT_SIGNUP_TEXT );
    }

    /**
     * When we decide to add headers, and other metadata
     * to the request, outsource and turn into a generated builder Class
     */
    public class Request {

        private Map<String, String> emails;

        Request() {
            emails = new HashMap<>(  );
            emails.put(servicedGmailFullCapabilitiesEmail, servicedGmailFullCapabilitiesService);
        }

        public Response post(String url, JSONObject body) {
            Response preRedirectResponse = RestAssured.given()
                                                      .redirects().follow( false )
                                                      .body( body.toString() )
                                                      .when().post( url );

            String location = preRedirectResponse.getHeader( "Location" );

            return RestAssured.given()
                              .cookies(preRedirectResponse.getCookies())
                              .when().get(location)
                              .thenReturn();
        }

        /**
         * {
         *   emailCount = Integer
         *   subjectPattern = "String.*That_is_regex.*"
         *   dateAfter = (ISO 8601 Date"2018-05-10T17:24:58.000Z")
         *   timeout = Integer (seconds)
         * }
         * @param
         * @return Response
         */
        public Response checkEmail(String email, String subjectPattern, String emailSearchPastDate, Integer timeout) throws Exception {
            String serviceURL = emails.get( email );

            HashMap<String, Object> model = new HashMap<>(  );
            model.put( "emailCount", 10 );
            model.put( "subjectPattern", subjectPattern );
            model.put( "dateAfter", emailSearchPastDate );
            model.put( "timeout", timeout ); // seconds; the Apps Script converts to ms

            JSONObject json = new JSONObject(model);

            Response response = null;
            if (serviceURL != null) {
                response = post( serviceURL, json );
                if(response.getBody().asString().equals( "{}" )) {
                    LOGGER.warn( "[ FAIL ] Did not find response data using request: " +
                                       json.toJSONString(), getCurrentMethodName());
                }
            } else {
                LOGGER.error( " [ FAIL ] Couldn't find the service url for account " +
                                   email, getCurrentMethodName() );
            }
            return response;
        }

        public String findEmailClickthroughLink(Response response, String htmlPatternToParse, String plaintextPatternToParse) throws Exception {
            String body = response.getBody().asString();
            findTextInEmail(body, plaintextPatternToParse, "PLAINTEXT");
            return findTextInEmail( body, htmlPatternToParse, "HTML" );
        }

        private String findTextInEmail(String sourceText, String regex, String emailType ) throws Exception{
            String targetText = "";

            Pattern pattern =  Pattern.compile( regex );
            Matcher matcher = pattern.matcher( sourceText.replace("\\", "") );
            if(matcher.matches()) {
                targetText = matcher.group(1);
                LOGGER.info( " [ PASS ] Found a link from " + emailType + " email " +
                                   targetText, getCurrentMethodName() );
            }
            return targetText;
        }
    }
}

In the future you may modify your Google Apps Script by publishing a new version (or overwriting the existing one). Depending on the change, this modified “contract” of your proxy may require corresponding updates in the client application. With this in mind, you now have the power to use Google Apps Script to verify emails.

 

NOTE: In case you need extra configuration around security, you can take the more configurable approach by using the Gmail API directly.

Testing using Postman

Postman is a very handy tool for sending requests (which are mock-able) during development and testing. This “post” will address some common ways Postman can be utilized in a testing effort.

1. Manual Testing

When you need to execute a specific request against a server, Postman allows you to send it directly. NOTE: If you are jumping back and forth between the browser and Postman (very common), you will want to sync your browser cookies with Postman via the Interceptor to share access to the session; this is a big time saver.

2. Automated BE Smoke Testing

For very common user scenarios, more often than not, you can automate testing by sending the critical requests necessary to mimic a user’s experience. For example: a user logs in, searches for a product, adds 2 then removes 1, submits their order, and confirms their order history. Due to the stable nature of backend tests, this type of testing is recommended as a robust core for functional testing. Using Postman is faster than creating a custom test framework, and it is intuitive to share the Postman collection tests with other members of your team (no documentation necessary 😉 ).

3. Performance Testing

If you have built out multiple user workflows in your Postman collection(s), you can utilize them by creating parameterized iterations with your CSV (more on that below). In order to find system thresholds, you can scale up the iterations (and tune the delays), or even run multiple collections simultaneously while monitoring your system. Admittedly, there are better tools suited to this purpose.

4. Bootstrapping

Very often there is a need to create data necessary for other services to work properly. Running a collection will allow all the requests to fire in sequence to perform the procedure you need. Often this is used for setting up a system or creating dummy data.

Making the most out of it

In order to fully maximize the effectiveness of Postman, be sure to take advantage of Pre-request scripts, as well as the post-request Tests. With these you can manipulate variables stored between requests, as well as make assertions on the state of the response. Postman scripts are written in JavaScript.

Next, there may be a set of data you would like to parameterize your requests with — this is done by binding that data to a variable ( {{variable}} ).

Taking variable binding a step further, you can pass in a CSV data set (a “table” with headers of variable names) to decouple the Postman collection from the data it will use. Using a CSV data set allows your collection to run N times (iterations), once per row of data in your data set. When running (via the Postman Runner), you can specify a delay between iterations if necessary.
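For example, a data file for a login-and-purchase workflow might look like this, where the headers match the {{variable}} names bound in the requests (all values here are made up):

```csv
username,password,productId
alice@example.com,pa55word,1001
bob@example.com,hunter2,1002
```

Running this file through the Postman Runner (or via `newman run collection.json -d data.csv`) executes the collection once per row.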

Finally, you can execute your postman collection(s) in your CI/CD system by way of Node.js and the Newman package.

Conclusion

Although Postman is not as flexible as codifying your own solution for testing (it is not possible to run BE + FE hybrid tests, 3rd-party library integration is not supported, there are varying subscription plan restrictions, etc.), it certainly is a staple in any development and testing initiative.

How to enter the Tech industry as a Software Developer/Engineer

According to various sources, one of the fastest growing professions in the nation is Software Developer/Engineer. The nationwide median for this profession is an annual $69K – $80K in 2019, compared to the estimated national median for all professions of $46.8K. This profession is thought to be highly desirable not only due to the monetary aspect, but also because of:

  • The ease of finding similar job offerings
  • Health Benefits
  • Work schedule flexibility (even remote work days)
  • Promotion potential
  • Degree of worker satisfaction
  • Casual dress code

Obviously, entering this profession won’t resolve all of life’s problems. Some common complaints of this profession may include:

  • Long work hours for some projects
  • Fast pace
  • Lack of diversity

With that in mind, I would like to discuss various approaches for entering the industry as a Software Developer.

1 – Go to college/university

This is a common approach for the many people who are young and fresh out of high school. Modern universities and colleges have various courses geared to prepare graduates with skills for entering the software profession. For a more universally timeless approach, it makes sense to choose Computer Science as a major. For the business savvy, or the more contextually minded, Information Technology is another option. Assuming you graduate, going to school practically guarantees you an entry-level Software Developer job when you begin job searching; even better if you have a portfolio.

The obvious downside to going to college/university is the cost and time you will have spent. It’s not rare to graduate with some debt from loans. It is also not uncommon to hear how impractical the knowledge gained was upon entering the real world. For this reason, if you decide to go to school for 4 years, I’d advise taking prerequisite courses at a community/technical college before transferring for your last 2 years to take the more focused, major-relevant courses. With this, you save money on tuition and have more wiggle room if you decide to switch majors.

2 – Enroll in coding Boot-camp

This is a middle-ground alternative I’d recommend for those with a specific idea of which area of expertise they would like to focus on, and/or those who want to switch their current career to something in tech but need a structured regimen to do so. In 2019, this is a popular approach considering the ocean of material one can find online for any development stack. Switchup.org has a really nice list of some coding boot-camps out there (i.e. App Academy, Flatiron School, Coding Dojo, etc.).

Most of these schools have guaranteed job placement post completion, and some even defer tuition until you are hired. Some downsides of taking this approach are the cost (~$9K – $20K), the focus on specific stack technologies, and the intensiveness (it’s called boot-camp for a reason). Overall it seems like a reasonable investment if you have the time to put in.

3 – Save enough money, quit your current position, and put aside a few months to build your portfolio

Made possible by the plethora of information online today, this is a very arguable approach due to the risk level. But where there are risks, there are rewards, and it is worth mentioning. If you understand what a Software Developer position requires (a certain skill-set around a stack such as front-end, back-end, or DevOps; the ability to answer common interview questions like algorithms, live coding, and “what would you do” scenarios; good communication; and a curious, team-player state of mind), working up to your first job will be a matter of setting your goal, defining tasks, and spending the time to become well-versed in each of the key areas.

Taking this route will require lots of discipline and also access to the materials that will get you technically proficient. For example, if you would like to get a formal understanding of computer science concepts, you can enroll in an open university like Open or Edx, among many others.

Or maybe you need to understand the big picture but also have access to video walkthroughs of how to use a technology. That’s where sites like Pluralsight, Udemy, or Egghead come in. Reading official documentation is always the first recommendation though 😉

Once you have gained an understanding of how to use and develop with your chosen stack, you should build your portfolio by working on a project, even better if it’s with a team. You should at least have a GitHub presence.

At this stage I would highly recommend learning a cloud platform such as AWS, Azure, or GCP, if you have not. Many companies, large and start-up alike, are utilizing the cloud, so it is good to play with and understand key services (storage, instance provisioning, architecture as code, lambda, etc.). Getting AWS Developer certified, for instance, will cost $150 but will help show prospective companies you understand the fundamentals.

Once you have all this under your belt, it’s a good time to start taking practice interviews and hunting! Update your LinkedIn profile and head out to the job boards. At this point it will take time, but be patient and consistent in applying to the jobs you are passionate about and highly desire. Be confident in your abilities, because learning this much to switch careers means you are proactive, disciplined, and highly adaptable, all common traits of a successful Software Developer/Engineer.

Best of luck!

Salesforce – Unit test generator for profile field accessibility verification by XML

This brief post is a continuation of the prior one and discusses the possibility of generating tests from profile XML(s), rather than using SOQL. Note: this is basically a proof of concept that relies on reading a directory of profile XML files, then parsing the field accessibility values based on a target object. The fields gathered from the profile XML are not exhaustive and thus may not result in passing tests. See the last post for a more accurate solution.

This script, “generateProfileUnitTests.py”, was created in Python 2.7; it will generate a sample Salesforce unit test named “generateProfileUnitTests.cls”.

#!/bin/python

"""
python generateProfileUnitTests.py -o 'Contact' -d 'C:\Salesforce\profiles'
"""

import sys
import xml.etree.ElementTree as ET
import argparse
from os import listdir
from os.path import isfile, join
import re

parser = argparse.ArgumentParser()                                               

parser.add_argument("--sobject", "-o", type=str, required=True)
parser.add_argument("--profiledirectorypath", "-d", type=str, required=True)
args = parser.parse_args()

sobject = args.sobject

filetemplatePre = """
@isTest
public class ContactObjectTest {{

    static String writeFieldName = 'PermissionsEdit';

    /**
    object = Contact
    profile = System Administrator
    **/
    private static void runProfileTest(String objectName, String profile, Map<String, Map<String, Boolean>> expectedPerms) {{
        Boolean success = true;
        try 
        {{
            List<FieldPermissions> perms = [SELECT Id, Field, SObjectType, PermissionsRead, PermissionsEdit 
                FROM fieldPermissions 
                WHERE SObjectType = :objectName 
                AND parentId in ( SELECT id 
                    FROM permissionSet 
                    WHERE PermissionSet.Profile.Name = :profile)];
            
            Set<String> nonExpectedFieldsFound = new Set<String>();
            // Go through actual perms and make sure they exist if expected
            for(FieldPermissions perm  : perms) {{
                try {{
                    Map<String, Boolean> expectedPerm = expectedPerms.get(perm.Field);
                    System.assertEquals(expectedPerm.get(writeFieldName), perm.PermissionsEdit,
                        'Permission named ' + perm.Field + ' is ' + perm.PermissionsEdit + ' but expected ' + expectedPerm.get(writeFieldName)
                    );
                    // Should also create a copy and remove (to assert exact fields?)
                }} catch (NullPointerException e) {{
                    nonExpectedFieldsFound.add(perm.Field);
                    // Error is 'Attempt to de-reference a null object'
                    System.debug('Found a field that was not in expected permissions: ' + perm.Field);
                    success = false;
                }}
            }}
            System.assertEquals(0, nonExpectedFieldsFound.size(), 'Found Read only fields in ' + objectName + ' for ' + 
                'profile -- ' + profile + ' -- that were not in expected set: ' + nonExpectedFieldsFound);
        }} 
        catch (Exception e) 
        {{
            System.debug('Failed profile field test ' + e.getMessage());
            success = false;
        }} 
        finally 
        {{
	        System.assert(success);
        }}
    }}

	static Map<String, Boolean> createPerm(String writeName, Boolean value) {{
        Map<String, Boolean> perm = new Map<String, Boolean>();
        perm.put(writeName, value);
        return perm;
    }}

    /****************** PROFILE FIELD ACCESS TESTS *****************/
    {tests}
}}
"""

fileTemplateInsertTest = """
    static testMethod void test{sobject}ReadWriteFields{profileFormatted}Profile() {{
        runProfileTest('{sobject}', '{profile}', {expectedFieldsMethod}());
    }}

"""

fileTemplateInsertExpectedFields = """
    static Map<String, Map<String, Boolean>> get{sobject}{profileFormatted}Fields() {{
        Map<String, Map<String, Boolean>> {sobject}Fields = new Map<String, Map<String, Boolean>>();

        {insertExpectedField}

        return {sobject}Fields;
    }}

"""

fileTemplateInsertExpectedField = """
        {sobject}Fields.put('{fieldName}', createPerm(writeFieldName, {editFieldAccess}));"""

testFile = ''
tests = ''

for f in listdir(args.profiledirectorypath):
    if isfile(join(args.profiledirectorypath, f)):
        tree = ET.parse(join(args.profiledirectorypath, f))
        profileName = f.split('.')[0]

        expectedField = ''

        for child in tree.getroot():
            if 'fieldPermissions' in child.tag:
                # Get the field name, e.g. Contact.Email
                fieldName = child.find('{http://soap.sforce.com/2006/04/metadata}field')
                if sobject + '.' in fieldName.text:
                    editable = child.find('{http://soap.sforce.com/2006/04/metadata}editable')
                    # readable = child.find('{http://soap.sforce.com/2006/04/metadata}readable')
                    expectedField += fileTemplateInsertExpectedField.format(sobject=sobject,
                                                                           fieldName=fieldName.text,
                                                                           editFieldAccess=editable.text)
        profileFormatted = re.sub('[^a-zA-Z]+', '', profileName)
        insertExpectedFields = fileTemplateInsertExpectedFields.format(sobject=sobject,
                                                                       profileFormatted=profileFormatted,
                                                                       insertExpectedField=expectedField)

        insertTest = fileTemplateInsertTest.format(sobject=sobject,
                                                   profileFormatted=profileFormatted,
                                                   profile=profileName,
                                                   expectedFieldsMethod='get' + sobject + profileFormatted + 'Fields')
        tests += insertExpectedFields
        tests += insertTest

testFile = filetemplatePre.format(tests=tests)
f = open('generateProfileUnitTests.cls', 'w')
f.write(testFile)
f.close()

 

Salesforce – Unit test generator for profile field accessibility verification

When testing Salesforce, there is often a desire to test the view(s) of a workflow as different users. A common strategy for this is to add automation on the UI, using a functional automation tool such as Selenium.

Depending on the number of profiles in your Salesforce organization, this is a very time consuming and brittle process — it entails running the same workflow for users of a unique profile, while checking both Read, and Write accessibility for many field elements (this is also dependent on the page layout).

Taking this route, we may run the risk of inverting our test pyramid. What we can do to remedy this issue is fairly simple since we know profile configuration is accessible from XML and also using SOQL to query object permissions. So this begs the question, “How can we structure a test to verify field permission accessibility for a given profile”?

1) Overall test case (see Salesforce’s documentation to understand Apex unit testing)

@isTest
public class ContactObjectTest {

    static testMethod void testContactReadWriteFieldsSystemAdministratorProfile() {
        runProfileTest('Contact', 'System Administrator', getContactSystemAdministratorFields());
    }
}

2) Flesh out the generator

@isTest
public class ContactObjectTest {

    static String writeFieldName = 'PermissionsEdit';

    /**
    object = Contact
    profile = System Administrator
    **/
    private static void runProfileTest(String objectName, String profile, Map<String, Map<String, Boolean>> expectedPerms) {
        Boolean success = true;
        try 
        {
            List<FieldPermissions> perms = [SELECT Id, Field, SObjectType, PermissionsRead, PermissionsEdit 
                FROM fieldPermissions 
                WHERE SObjectType = :objectName 
                AND parentId in ( SELECT id 
                    FROM permissionSet 
                    WHERE PermissionSet.Profile.Name = :profile)];
            
            Set<String> nonExpectedFieldsFound = new Set<String>();
            // Go through actual perms and make sure they exist if expected
            for(FieldPermissions perm  : perms) {
                try {
                    Map<String, Boolean> expectedPerm = expectedPerms.get(perm.Field);
                    System.assertEquals(expectedPerm.get(writeFieldName), perm.PermissionsEdit,
                        'Permission named ' + perm.Field + ' is ' + perm.PermissionsEdit + ' but expected ' + expectedPerm.get(writeFieldName)
                    );
                } catch (NullPointerException e) {
                    nonExpectedFieldsFound.add(perm.Field);
                    System.debug('Found a field that was not in expected permissions: ' + perm.Field);
                    success = false;
                }
            }
            System.assertEquals(0, nonExpectedFieldsFound.size(), 'Found Read only fields in ' + objectName + ' for ' + 
                'profile -- ' + profile + ' -- that were not in expected set: ' + nonExpectedFieldsFound);
        } 
        catch (Exception e) 
        {
            System.debug('Failed profile field test ' + e.getMessage());
            success = false;
        } 
        finally 
        {
            System.assert(success);
        }
    }
}

3) Add the test specific expected field accessibility map (createPerm, getContactSystemAdministratorFields methods)

@isTest
public class ContactObjectTest {

    static String writeFieldName = 'PermissionsEdit';

    /**
    object = Contact
    profile = System Administrator
    **/
    private static void runProfileTest(String objectName, String profile, Map<String, Map<String, Boolean>> expectedPerms) {
        Boolean success = true;
        try 
        {
            List<FieldPermissions> perms = [SELECT Id, Field, SObjectType, PermissionsRead, PermissionsEdit 
                FROM fieldPermissions 
                WHERE SObjectType = :objectName 
                AND parentId in ( SELECT id 
                    FROM permissionSet 
                    WHERE PermissionSet.Profile.Name = :profile)];
            
            Set<String> nonExpectedFieldsFound = new Set<String>();
            // Go through actual perms and make sure they exist if expected
            for(FieldPermissions perm  : perms) {
                try {
                    Map<String, Boolean> expectedPerm = expectedPerms.get(perm.Field);
                    System.assertEquals(expectedPerm.get(writeFieldName), perm.PermissionsEdit,
                        'Permission named ' + perm.Field + ' is ' + perm.PermissionsEdit + ' but expected ' + expectedPerm.get(writeFieldName)
                    );
                } catch (NullPointerException e) {
                    nonExpectedFieldsFound.add(perm.Field);
                    System.debug('Found a field that was not in expected permissions: ' + perm.Field);
                    success = false;
                }
            }
            System.assertEquals(0, nonExpectedFieldsFound.size(), 'Found fields in ' + objectName + ' for ' + 
                'profile -- ' + profile + ' -- that were not in the expected set: ' + nonExpectedFieldsFound);
        } 
        catch (Exception e) 
        {
            System.debug('Failed profile field test ' + e.getMessage());
            success = false;
        } 
        finally 
        {
            System.assert(success);
        }
    }

    static Map<String, Boolean> createPerm(String writeName, Boolean value) {
        Map<String, Boolean> perm = new Map<String, Boolean>();
        perm.put(writeName, value);
        return perm;
    }

    /****************** PROFILE FIELD ACCESS TESTS *****************/
    
    static Map<String, Map<String, Boolean>> getContactSystemAdministratorFields() {
        Map<String, Map<String, Boolean>> ContactFields = new Map<String, Map<String, Boolean>>();

        
        ContactFields.put('Contact.Title', createPerm(writeFieldName, True));
        ContactFields.put('Contact.ReportsTo', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Phone', createPerm(writeFieldName, True));
        ContactFields.put('Contact.OtherPhone', createPerm(writeFieldName, True));
        ContactFields.put('Contact.OtherAddress', createPerm(writeFieldName, True));
        ContactFields.put('Contact.MobilePhone', createPerm(writeFieldName, True));
        ContactFields.put('Contact.MailingAddress', createPerm(writeFieldName, True));
        ContactFields.put('Contact.LeadSource', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Jigsaw', createPerm(writeFieldName, True));
        ContactFields.put('Contact.HomePhone', createPerm(writeFieldName, True));
        ContactFields.put('Contact.HasOptedOutOfFax', createPerm(writeFieldName, True));
        ContactFields.put('Contact.HasOptedOutOfEmail', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Fax', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Email', createPerm(writeFieldName, True));
        ContactFields.put('Contact.DoNotCall', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Description', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Department', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Birthdate', createPerm(writeFieldName, True));
        ContactFields.put('Contact.AssistantPhone', createPerm(writeFieldName, True));
        ContactFields.put('Contact.AssistantName', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Account', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Time_Zone__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Suffix__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Seasonal_Only__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Salutation__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.SMSEnabled__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Rehire_Location__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Rehire_Eligibility_Status__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Previously_Used_Full_Name__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Preferred_Phone_Number__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Portal_User__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Portal_User_Link__c', createPerm(writeFieldName, False));
        ContactFields.put('Contact.Override_Flag__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Mobile_Phone__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Mobile_Phone_Country_Code__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Middle_Name__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Last_Name__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Language__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Internal_External__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Internal_Email__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Internal_Candidate__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Home_Phone__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Home_Phone_Country_Code__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.First_Name__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.External_Email__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Employee_ID__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.EMPL_Rcd_No__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Current_Mailing_Adddress__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Country_Code_PS__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Contact_Profile_Submitted__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Candidate_ID__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Agency_Name__c', createPerm(writeFieldName, True));
        ContactFields.put('Contact.Address_Line_2__c', createPerm(writeFieldName, True));

        return ContactFields;
    }


    static testMethod void testContactReadWriteFieldsSystemAdministratorProfile() {
        runProfileTest('Contact', 'System Administrator', getContactSystemAdministratorFields());
    }
}

Now that we have seen how the unit test verifies field permissions under the System Administrator profile, extending it to other profiles is as simple as adding another testMethod along with its getObjectProfileFields map. Since the same pattern works for other objects (Account, Contact, etc.), we can create a generator that cranks out tests for a given object and the desired profiles.

Here’s a script to do so in Python 3.7.2. There are four required parameters: access token, instance URL, profiles (comma separated), and object. You can save it as generateProfileUnitTestsFromSoql.py

#!/usr/bin/env python3

"""
> python.exe generateProfileUnitTestsFromSoql.py 
-t '00D110000001O34!ARIAQLW5MsJqTVUbwgl13xDW_UGvZBG5GEJC.4bxsuzWc.ehOrnuRhT.MtMSrb0wCP07wfc71C6gEOnsSP0CknZnPdkzDUnc' 
-u '//customDomain.my.salesforce.com' 
-p 'System Administrator,Alternative-System Administrator,Standard User' 
-o Contact
"""

import argparse
import requests
import re

parser = argparse.ArgumentParser()                                               

parser.add_argument("--authtoken", "-t", type=str, required=True)
parser.add_argument("--instanceurl", "-u", type=str, required=True)
parser.add_argument("--sobject", "-o", type=str, required=True)
parser.add_argument("--profilenames", "-p", type=str, required=True)

args = parser.parse_args()

sobject = args.sobject
token = args.authtoken
instanceUrl = args.instanceurl
profileNames = args.profilenames.split(',')

filetemplatePre = """
@isTest
public class ContactObjectTest {{

    static String writeFieldName = 'PermissionsEdit';

    /**
    object = Contact
    profile = System Administrator
    **/
    private static void runProfileTest(String objectName, String profile, Map<String, Map<String, Boolean>> expectedPerms) {{
        Boolean success = true;
        try 
        {{
            List<FieldPermissions> perms = [SELECT Id, Field, SObjectType, PermissionsRead, PermissionsEdit 
                FROM fieldPermissions 
                WHERE SObjectType = :objectName 
                AND parentId in ( SELECT id 
                    FROM permissionSet 
                    WHERE PermissionSet.Profile.Name = :profile)];
            
            Set<String> nonExpectedFieldsFound = new Set<String>();
            // Go through actual perms and make sure they exist if expected
            for(FieldPermissions perm  : perms) {{
                try {{
                    Map<String, Boolean> expectedPerm = expectedPerms.get(perm.Field);
                    System.assertEquals(expectedPerm.get(writeFieldName), perm.PermissionsEdit,
                        'Permission named ' + perm.Field + ' is ' + perm.PermissionsEdit + ' but expected ' + expectedPerm.get(writeFieldName)
                    );
                }} catch (NullPointerException e) {{
                    nonExpectedFieldsFound.add(perm.Field);
                    System.debug('Found a field that was not in expected permissions: ' + perm.Field);
                    success = false;
                }}
            }}
            System.assertEquals(0, nonExpectedFieldsFound.size(), 'Found fields in ' + objectName + ' for ' + 
                'profile -- ' + profile + ' -- that were not in the expected set: ' + nonExpectedFieldsFound);
        }} 
        catch (Exception e) 
        {{
            System.debug('Failed profile field test ' + e.getMessage());
            success = false;
        }} 
        finally 
        {{
            System.assert(success);
        }}
    }}

    static Map<String, Boolean> createPerm(String writeName, Boolean value) {{
        Map<String, Boolean> perm = new Map<String, Boolean>();
        perm.put(writeName, value);
        return perm;
    }}

    /****************** PROFILE FIELD ACCESS TESTS *****************/
    {tests}
}}
"""

fileTemplateInsertTest = """
    static testMethod void test{sobject}ReadWriteFields{profileFormatted}Profile() {{
        runProfileTest('{sobject}', '{profile}', {expectedFieldsMethod}());
    }}

"""

fileTemplateInsertExpectedFields = """
    static Map<String, Map<String, Boolean>> get{sobject}{profileFormatted}Fields() {{
        Map<String, Map<String, Boolean>> {sobject}Fields = new Map<String, Map<String, Boolean>>();

        {insertExpectedField}

        return {sobject}Fields;
    }}

"""

fileTemplateInsertExpectedField = """
        {sobject}Fields.put('{fieldName}', createPerm(writeFieldName, {editFieldAccess}));"""

testFile = ''
tests = ''

for profileName in profileNames:
    response = requests.get(instanceUrl + "/services/data/v44.0/query?q=" 
                "SELECT Id, Field, SObjectType, PermissionsRead, PermissionsEdit FROM fieldPermissions "
                "WHERE SObjectType = '" + sobject + "' AND parentId in " 
                "( SELECT id FROM permissionSet WHERE PermissionSet.Profile.Name = '" + profileName + "')", 
                headers={'Authorization': 'Bearer ' + token})

    expectedField = ''

    for record in response.json()['records']:
        # Get the field name and its edit permission for this profile
        fieldName = record['Field']
        editable = record['PermissionsEdit']
        # readable = record['PermissionsRead']

        expectedField += fileTemplateInsertExpectedField.format(sobject=sobject,
                                                                fieldName=fieldName,
                                                                editFieldAccess=editable)
    profileFormatted = re.sub('[^a-zA-Z]+', '', profileName)
    insertExpectedFields = fileTemplateInsertExpectedFields.format(sobject=sobject,
                                                            profileFormatted=profileFormatted,
                                                            insertExpectedField=expectedField)

    insertTest = fileTemplateInsertTest.format(sobject=sobject,
                                               profileFormatted=profileFormatted,
                                               profile=profileName,
                                               expectedFieldsMethod='get' + sobject + profileFormatted + 'Fields')
    tests += insertExpectedFields
    tests += insertTest

testFile = filetemplatePre.format(tests=tests)
with open('generateProfileUnitTests.cls', 'w') as f:
    f.write(testFile)
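The generated Apex method names come from `re.sub('[^a-zA-Z]+', '', profileName)`, which strips anything that is not a letter so the profile name can be embedded in an Apex identifier. A quick illustration of that sanitization step:

```python
import re

def format_profile(profile_name):
    """Strip everything but letters so the profile name can be embedded
    in an Apex method identifier (mirrors the re.sub call in the script)."""
    return re.sub('[^a-zA-Z]+', '', profile_name)

# 'Alternative-System Administrator' -> 'AlternativeSystemAdministrator'
method_name = 'get' + 'Contact' + format_profile('Alternative-System Administrator') + 'Fields'
print(method_name)  # getContactAlternativeSystemAdministratorFields
```

This is why a profile like “Alternative-System Administrator” yields a valid test method name with no spaces or hyphens.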


Mission-critical Git commands

Git has become a staple tool for version control in modern development. It is a very powerful tool with extensive capabilities, but for everyday purposes you will revisit a core set of commands over and over again. Below are some frequently used commands that every Git user should know.

COMMON (Daily)

  • git pull — Pull the latest head from the remote, merging it into the head of the local branch
    • git pull
  • git add — Stage files that have been modified, newly added, or deleted
    • git add -u
  • git commit — Create a new commit out of the staged files
    • git commit -m "doc - my commit message"
  • git diff — Show the difference between a commit (default: latest) and unstaged changes
    • git diff fb1b7
  • git log — Review the log of commits
    • git log
  • git checkout — Often used to check out branches, but also to restore files
    • git checkout -b feature-myNewBranch
  • git reset — Reset HEAD to another commit
    • git reset --hard abd43
  • git rebase — Replay commits onto another base; with -i, squash commits for log hygiene
    • git rebase -i HEAD~3
  • git push — Push the HEAD of the current branch to a given remote
    • git push origin

UNCOMMON (Every now and then)

  • git fetch — Retrieve the latest refs from the remote without merging
  • git init — Create a new Git repository (for one that does not have a remote)
  • git clone {url} — Clone a Git repository from an existing one (on some remote)
  • git tag -a myNewTag -m "Tag description" — Create a new tag

In addition to these commands, some users may prefer a GUI tool, such as GitKraken or Sourcetree, to provide a friendly interface for performing Git tasks.

The argument for Behavior-Driven Development (BDD)

In most development efforts, the features and capabilities to build come from the stakeholder(s) sponsoring the development — if not yourself. The stakeholders interact with someone technically proficient who decides what gets scheduled, prioritized, designed, and implemented. Often this person takes the role of a Product Owner/Manager. 

Since one of the main focuses of a Product Owner’s day-to-day is understanding requirements and making sure features are delivered as expected — bug free — they need a window to judge whether this is the case. One way to do this is to write up a Specification document for a feature and work with a test team to ensure a Test Plan with appropriate Test Cases is created. At that point the Product Owner can decide whether sufficient cases and coverage have been accounted for.

This approach works when resources are dedicated to running each test case and reporting statuses, both while developing a new feature and during regression. In theory it is nice, but it is not easy to keep everything in lockstep.

Behavior-Driven Development (BDD) to the rescue!

An alternative approach that lets Product Owners understand the extent of a feature’s development, while reducing the effort to synchronize with a test team, is to utilize BDD (a form of TDD that focuses on user acceptance testing, or UAT). In this context, User Stories are created by the Product Owner by working with stakeholders and put into a backlog of stories. Each scenario describes a specific thing that a user would do, stated descriptively and accurately. For example, an auction website platform might have a scenario:

As an existing member of the U-Sell-It Auction Site
when I bid on a product that already has bids
and my bid is lower than prior bids,
then a message is displayed that my bid amount is too low

As you can see, this is very clear to a Product Owner, and to nearly anyone else looking at the scenario! Also, when product features and capabilities are framed this way, it helps to identify potential gaps.
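Frameworks like Cucumber work by mapping each scenario line to a step definition via pattern matching. The toy Python sketch below (not any real framework’s API — the `step` decorator and `run_scenario` helper are made up for illustration) shows that core mechanic:

```python
import re

# Registry of (pattern, handler) pairs -- real BDD frameworks
# (Cucumber, JBehave) do this far more robustly.
STEPS = []

def step(pattern):
    """Register a step definition keyed by a regex pattern."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r'I bid \$(\d+) on a product whose highest bid is \$(\d+)')
def place_bid(ctx, bid, highest):
    # Reject bids at or below the current highest bid
    ctx['message'] = ('bid amount is too low'
                      if int(bid) <= int(highest) else 'bid accepted')

@step(r'a message is displayed that my (.+)')
def check_message(ctx, expected):
    assert expected in ctx['message'], ctx['message']

def run_scenario(lines):
    """Dispatch each scenario line to the matching step definition."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS:
            match = pattern.search(line)
            if match:
                fn(ctx, *match.groups())
    return ctx

ctx = run_scenario([
    'when I bid $5 on a product whose highest bid is $10',
    'then a message is displayed that my bid amount is too low',
])
```

The plain-English scenario stays readable to stakeholders, while the step definitions carry the executable logic — that separation is the essence of the approach.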

How will the test team use this?

In a testing effort, many BDD frameworks can be used, such as Cucumber and JBehave. At first, adding another DSL to a project may seem like overhead — especially if testing is performed by a team with “test” expertise — and that is a common complaint among would-be adopters of BDD frameworks. Still, the value gained from using one typically outweighs the cost of going without.

Benefits:

  • “Transparency” in the tests created. Tests can be shared with anyone and are highly understandable (though it is still possible to map the DSL to badly designed & inaccurate code…).
  • Logs become much more readable, making it easy to identify the failing step.
  • Separation of concerns. Anyone can create new tests, or edit existing ones, at a high level.
  • May help drive better design decisions when creating the code behind the DSL (making it generic & more reusable).

Drawbacks:

  • BDD framework needs to be learned.
  • Custom code may be required to interact with the DSL (functionality previously available out-of-the-box without the framework).
  • Another dependency.
  • New layer to the existing test project + more code.

Weighing the benefits against the drawbacks, a BDD framework can be a very effective addition to your development life-cycle.

Initializing DB data in Spring Boot for different environments

In many scenarios, it is desirable to seed an environment with data prior to use. In this post, we will discuss how to initialize a NoSQL DB (by running startup scripts) with seeded data in different environments, using the facilities provided by Spring Boot.

From Spring Initializr create a project with at least “Web” and “Cassandra” dependencies. Your pom.xml will look something like

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="//maven.apache.org/POM/4.0.0" xmlns:xsi="//www.w3.org/2001/XMLSchema-instance"
   xsi:schemaLocation="//maven.apache.org/POM/4.0.0 //maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>
   <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>2.1.1.RELEASE</version>
      <relativePath/> <!-- lookup parent from repository -->
   </parent>
   <groupId>com.blog</groupId>
   <artifactId>musicstore</artifactId>
   <version>0.0.1-SNAPSHOT</version>
   <name>musicstore</name>
   <description>Demo project for Spring Boot</description>

   <properties>
      <java.version>1.8</java.version>
   </properties>

   <dependencies>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-actuator</artifactId>
      </dependency>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-data-cassandra</artifactId>
      </dependency>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-web</artifactId>
      </dependency>

      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-test</artifactId>
         <scope>test</scope>
      </dependency>
   </dependencies>

   <build>
      <plugins>
         <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
         </plugin>
      </plugins>
   </build>

</project>

Once you have your project template, begin to create the web services, models, and configuration for the application. The project layout may look something like below:

Spring boot project structure

Assuming your NoSQL database is up, you can test the application by adding the Spring DB configuration to the application.properties file (see a full list here) and running the app. *Note* At this point there is no need to add a configuration class and create bean(s) for the NoSQL DB — Spring Boot uses the key-value pairs in the configuration to “auto” create the DB bean(s).
Here’s the output from running:

2019-01-23 19:15:58.816 ERROR 17905 --- [           main] o.s.boot.SpringApplication               : Application run failed

org.springframework.context.ApplicationContextException: Unable to start web server; nested exception is org.springframework.boot.web.server.WebServerException: Unable to start embedded Tomcat
....
....
Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: Keyspace 'music_store' does not exist

This exception is a Cassandra-specific error relating to … the keyspace not existing! By default, our keyspace (think of it as our DB) cannot be created using the convenient Spring Cassandra properties in application.properties. We will need to create our Cassandra DB beans manually and have them create the keyspace if it does not exist.
That configuration file looks like this:

package com.blog.musicstore.configuration;

import com.blog.musicstore.MusicStoreApplication;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.cassandra.config.AbstractCassandraConfiguration;
import org.springframework.data.cassandra.config.CassandraCqlClusterFactoryBean;
import org.springframework.data.cassandra.config.SchemaAction;
import org.springframework.data.cassandra.core.cql.keyspace.CreateKeyspaceSpecification;
import org.springframework.data.cassandra.core.cql.keyspace.KeyspaceOption;
import org.springframework.data.cassandra.repository.config.EnableCassandraRepositories;

import java.util.*;

@Configuration
@EnableCassandraRepositories(basePackages = "com.blog.musicstore")
public class DatabaseConfig extends AbstractCassandraConfiguration {

    @Value("${spring.data.cassandra.keyspace-name}")
    private String keyspaceName;

    @Value("${spring.data.cassandra.port}")
    private Integer port;

    @Value("${spring.data.cassandra.contact-points}")
    private String contactPoints;

    @Bean
    @Override
    public CassandraCqlClusterFactoryBean cluster() {
        CassandraCqlClusterFactoryBean bean = new CassandraCqlClusterFactoryBean();
        bean.setKeyspaceCreations(getKeyspaceCreations());
        bean.setContactPoints(contactPoints);
        bean.setPort(port);
        bean.setJmxReportingEnabled(false);
        return bean;
    }

    @Override
    protected String getKeyspaceName() {
        return keyspaceName;
    }

    @Override
    public String[] getEntityBasePackages() {
        return new String[]{MusicStoreApplication.class.getPackage().getName()};
    }

    @Override
    protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
        return Collections.singletonList(CreateKeyspaceSpecification.createKeyspace(getKeyspaceName())
                .ifNotExists()
                .with(KeyspaceOption.DURABLE_WRITES, true)
                .withSimpleReplication());
    }

    @Override
    public SchemaAction getSchemaAction() {
        return SchemaAction.CREATE_IF_NOT_EXISTS;
    }
}

Compiling and running the app again, we no longer see errors. The keyspace was created, and so were the tables (model repository classes should extend CassandraRepository and have the @Repository annotation). That’s great!

Keyspace created with tables

Adding data for different environments

Spring uses Profiles to allow runtime differentiation of the same compiled bytecode. We will use them to perform environment-specific DB seeding before the application runs, ensuring each environment has its own setup data (often for testing).
In Spring this is easy: just add a property file for each environment/profile.

application.properties

spring.data.cassandra.keyspace-name=music_store
spring.data.cassandra.contact-points=localhost
spring.data.cassandra.port=9042

application-dev.properties

spring.data.cassandra.keyspace-name=musicstore_dev
spring.data.cassandra.contact-points=localhost
spring.data.cassandra.port=9042
not.a.real.property.dev.specific.stuff=blah

application-qa.properties

spring.data.cassandra.keyspace-name=musicstore_qa
spring.data.cassandra.contact-points=localhost
spring.data.cassandra.port=9042
not.a.real.property.qa.specific.stuff=blah

Now when we run our application, make sure to set the active profile to the correct target — specified via an environment variable, a JVM system property, a context param in the web app (XML), etc.

-Dspring.profiles.active=dev

You should now see the new Keyspace set from the profile properties, created in Cassandra. But there is no data in it…
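Conceptually, the profile-specific file is layered over application.properties, with the profile’s values winning. A rough sketch of that resolution in Python (illustrative only — not Spring’s actual code):

```python
# Defaults from application.properties
base = {
    'spring.data.cassandra.keyspace-name': 'music_store',
    'spring.data.cassandra.contact-points': 'localhost',
    'spring.data.cassandra.port': '9042',
}

# Overrides from application-<profile>.properties
profiles = {
    'dev': {'spring.data.cassandra.keyspace-name': 'musicstore_dev'},
    'qa':  {'spring.data.cassandra.keyspace-name': 'musicstore_qa'},
}

def resolve(active_profile):
    props = dict(base)                               # application.properties
    props.update(profiles.get(active_profile, {}))   # profile-specific file wins
    return props

print(resolve('dev')['spring.data.cassandra.keyspace-name'])  # musicstore_dev
```

With no matching profile file, the defaults apply unchanged — which is why running without `-Dspring.profiles.active` still targets the `music_store` keyspace.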

To add data to the DB, it would be nice if there were a Spring property in the properties file that specifies a script to run after keyspace creation. That capability is provided when using JPA or Hibernate. If we look into the “AbstractCassandraConfiguration” class (used to create our database beans), there are methods for adding data and tearing down:

getStartupScripts

and

getShutdownScripts

respectively. So let’s set up a startup script to load our SQL for each environment. (Note: it would be better to use an ORM library to interface with our DB, rather than raw SQL or CQL.)

SQL startup seed script

DatabaseConfig.java

package com.blog.musicstore.configuration;

import com.blog.musicstore.MusicStoreApplication;
import io.micrometer.core.instrument.util.IOUtils;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.core.io.Resource;
import org.springframework.data.cassandra.config.AbstractCassandraConfiguration;
import org.springframework.data.cassandra.config.CassandraCqlClusterFactoryBean;
import org.springframework.data.cassandra.config.SchemaAction;
import org.springframework.data.cassandra.core.cql.keyspace.CreateKeyspaceSpecification;
import org.springframework.data.cassandra.core.cql.keyspace.KeyspaceOption;
import org.springframework.data.cassandra.repository.config.EnableCassandraRepositories;

import java.io.IOException;
import java.io.InputStream;
import java.util.*;

@Profile({"qa", "dev"})
@Configuration
@EnableCassandraRepositories(basePackages = "com.blog.musicstore")
public class DatabaseConfig extends AbstractCassandraConfiguration {

    @Value("${spring.data.cassandra.keyspace-name}")
    private String keyspaceName;

    @Value("${spring.data.cassandra.port}")
    private Integer port;

    @Value("${spring.data.cassandra.contact-points}")
    private String contactPoints;

    @Value("classpath:env/sql/seed-#{environment.getActiveProfiles()[0]}.sql")
    private Resource sqlImport;

    //@Lazy
    //@Autowired
    //CassandraAdminTemplate template;

    @Bean
    @Override
    public CassandraCqlClusterFactoryBean cluster() {
        CassandraCqlClusterFactoryBean bean = new CassandraCqlClusterFactoryBean();
        bean.setKeyspaceCreations(getKeyspaceCreations());
        bean.setContactPoints(contactPoints);
        bean.setPort(port);
        bean.setJmxReportingEnabled(false);
        return bean;
    }

    @Override
    protected String getKeyspaceName() {
        return keyspaceName;
    }

    @Override
    public String[] getEntityBasePackages() {
        return new String[]{MusicStoreApplication.class.getPackage().getName()};
    }

    @Override
    protected List<CreateKeyspaceSpecification> getKeyspaceCreations() {
        return Collections.singletonList(CreateKeyspaceSpecification.createKeyspace(getKeyspaceName())
                .ifNotExists()
                .with(KeyspaceOption.DURABLE_WRITES, true)
                .withSimpleReplication());
    }

    @Override
    protected List<String> getStartupScripts() {
        List<String> scripts = new ArrayList<>();

        try(InputStream is = sqlImport.getInputStream()) {
            String string =  IOUtils.toString(is);
            scripts = Arrays.asList(string.split(";"));
        } catch (IOException e) {
            e.printStackTrace();
        }
        /* Doesn't work
        // 1) no valid reference to cassandraTemplate (Not initialized)
        // 2) can't extract sql from DAO
        try {
            ProductOrder productOrder = new ProductOrder("1","2", "3", new Date());
            template.insert(productOrder);

        } catch (Exception e) {
            e.printStackTrace();
        }
        return new ArrayList<>();
        */

        return scripts;
    }

    @Override
    public SchemaAction getSchemaAction() {
        return SchemaAction.CREATE_IF_NOT_EXISTS;
    }
}
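One caveat with the config above: getStartupScripts splits the seed file on every “;”, which breaks if a statement contains a semicolon inside a quoted string. A sketch (in Python, for illustration only) of a splitter that respects single-quoted strings:

```python
def split_statements(script):
    """Split a seed script into statements on ';', but ignore semicolons
    that appear inside single-quoted string literals."""
    statements, current, in_string = [], [], False
    for ch in script:
        if ch == "'":
            in_string = not in_string
            current.append(ch)
        elif ch == ';' and not in_string:
            stmt = ''.join(current).strip()
            if stmt:
                statements.append(stmt)
            current = []
        else:
            current.append(ch)
    tail = ''.join(current).strip()
    if tail:
        statements.append(tail)
    return statements

seed = "INSERT INTO album (id, name) VALUES (1, 'Hits; Vol 1'); INSERT INTO album (id, name) VALUES (2, 'B-Sides');"
print(split_statements(seed))
```

(This does not handle escaped quotes or comments; for anything beyond trivial seed files, a proper CQL/SQL parser or an ORM-based insert is the safer route.)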

Now, with the updated config, we should see data created in our environment when the application runs.

Data seeded in environment

And there we have it: a seeded DB for our target environment/profile. (Note: don’t forget to also add scripts for application shutdown.)

Setting up React Jest and Enzyme

Jest is a popular JS framework for testing React applications. On its own, though, it needs additional functionality to test React component behavior; Enzyme is a tool that facilitates component testing.

In this quick overview, we will set up a React 16 CRA application with Jest & Enzyme using NPM.

Assuming we have our application created from

npx create-react-app my-app

Hop into the app directory “my-app” and notice the “node_modules” folder. Many capabilities are provided out of the box, one of which is Jest.

Node_modules containing jest

According to the Jest getting-started documentation, we can specify a “test” command for NPM to run Jest. By default, Jest looks for test files in a few places, one of them being the existing “App.test.js” file created by CRA. Let’s edit it with a pure Jest test and run “npm test”.

package.json

{
  "name": "test-app",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@material-ui/core": "^3.8.3",
    "react": "^16.7.0",
    "react-dom": "^16.7.0",
    "react-router-dom": "^4.3.1",
    "react-scripts": "2.1.3"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "jest",
    "eject": "react-scripts eject"
  },
  "eslintConfig": {
    "extends": "react-app"
  },
  "browserslist": [
    ">0.2%",
    "not dead",
    "not ie <= 11",
    "not op_mini all"
  ]
}

App.test.js

import React from 'react';

test('renders without crashing', () => {
  expect(1).toBe(1);
});

Running “npm test” we get the error

({"Object.<anonymous>":function(module,exports,require,__dirname,__filename,global,jest){import React from 'react';
^^^^^

SyntaxError: Unexpected identifier

Based on the error (the first line of our test file), it looks like the compiler is complaining about the React and ES6 syntax. Reading further in Jest’s getting-started guide, it states that we should also specify a .babelrc file in order to use ES6 and React features in Jest. Let’s go ahead and add that file in the CRA root.

.babelrc

{
  "presets": ["@babel/env", "@babel/react"]
}

Now if we rerun “npm test” we should see a pass. Great!

Passing Jest test

The test we have at the moment does not exercise anything specific to React, i.e. component rendering. To get that working we want to use Enzyme, which is not provided by CRA. Here’s the guide:

$ npm install --save-dev enzyme enzyme-adapter-react-16

We also need to add a setup file before using Enzyme’s features. The following helper file goes in the CRA root folder.

enzyme.js

// setup file
import Enzyme, { configure, shallow, mount, render } from 'enzyme';
import Adapter from 'enzyme-adapter-react-16';

configure({ adapter: new Adapter() });
export { shallow, mount, render };
export default Enzyme;

Back in our App.test.js, we can edit the test to check whether our App component renders.

import React from 'react';
import App from './App';
import { shallow } from './enzyme';

test('renders without crashing', () => {
  const app = shallow(<App />);
  expect(
    app.containsAnyMatchingElements([
      <a>Learn React</a>
    ])
  ).toBe(true);
});

Here’s what we see.

Jest encountered an unexpected token

The error is from an import inside our component under test, “App.js”, stating “Jest encountered an unexpected token”. From the output, it looks like the SVG file is being parsed incorrectly. This error has been noted elsewhere.

We start by adding an assetTransformer.js file in the root.

assetTransformer.js

const path = require('path');

// When Jest hits an asset import, return a stub module that exports
// just the file's basename instead of the raw file contents.
module.exports = {
  process(src, filename, config, options) {
    return 'module.exports = ' + JSON.stringify(path.basename(filename)) + ';';
  },
};

Then we allow Jest to apply this transformation to assets during module mapping. This is done by adding a “moduleNameMapper” attribute under the “jest” property of package.json:

{
  "name": "test-app",
  "version": "0.1.0",
  "private": true,
  "dependencies": {
    "@material-ui/core": "^3.8.3",
    "react": "^16.7.0",
    "react-dom": "^16.7.0",
    "react-router-dom": "^4.3.1",
    "react-scripts": "2.1.3"
  },
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "test": "jest",
    "eject": "react-scripts eject"
  },
  "eslintConfig": {
    "extends": "react-app"
  },
  "browserslist": [
    ">0.2%",
    "not dead",
    "not ie <= 11",
    "not op_mini all"
  ],
  "jest": { 
    "moduleNameMapper": { 
      "\\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$": "<rootDir>/assetTransformer.js", 
      "\\.(css|less)$": "<rootDir>/assetTransformer.js" 
    } 
  },
  "devDependencies": {
    "enzyme": "^3.8.0",
    "enzyme-adapter-react-16": "^1.7.1"
  }
}
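The “moduleNameMapper” keys are regular expressions that Jest matches against import paths; anything that matches is redirected to assetTransformer.js instead of the real asset. A quick sanity check of the two patterns above:

```javascript
// The same two patterns from the "jest" config, written as JS regexes.
const assetPattern = /\.(jpg|jpeg|png|gif|eot|otf|webp|svg|ttf|woff|woff2|mp4|webm|wav|mp3|m4a|aac|oga)$/;
const stylePattern = /\.(css|less)$/;

console.log(assetPattern.test('./logo.svg')); // true  -> mapped to the stub
console.log(stylePattern.test('./App.css'));  // true  -> mapped to the stub
console.log(assetPattern.test('./App.js'));   // false -> resolved normally
```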

Here’s what we get from “npm test” this time:

Passing Jest + Enzyme test

Perfect!