Planet Drupal

In this blog post, we’ll take a look at what a typical month looks like for a developer working at Agiledrop: their daily and weekly tasks, monthly activities and events, and the general office atmosphere.

READ MORE

According to the facts and figures in this study, more than 58 percent of people prefer their smartphones over a desktop or laptop for browsing information on the internet. And when those responsible for development (on the backend) decide to go ahead without making any changes for mobile, users start getting annoyed.

GatsbyJS + Drupal: Create Content Type Landing Pages jflynn Sun, 12/01/2019 - 16:30

At NEDCamp I had the pleasure of seeing many wonderful people and sessions, but, as you may know, my favorite track at most events is the "Hallway Track". If you're not familiar, the Hallway Track is the time in-between sessions where some real magic happens. You have a chance to talk with some of the most amazing minds in the world about topics you're passionate about. You can share knowledge, bounce ideas, have epiphanies, and realize that your current problems (code-wise) are things that other people have run into. One such conversation happened that inspired me to think outside the box to solve a problem.

In my last post we went over how to create a related content section with references to entities that may have empty fields. Today, we're going to take that one step further and create landing pages for content types.

In Drupal, we would ordinarily build these by attaching a view to a content type based on contextual filters or similar in order to get a collection of all content of that content type. Since we don't have Views in GatsbyJS, we need another solution. There are a couple of options out there, including the recently released JSON:API Cross Bundles module, compliments of Centarro and Matt Glaman. However, at the time of this writing there is an issue with JSON:API Cross Bundles conflicting with JSON:API Extras. So, if you're relying on JSON:API Extras, you'll need another solution.

The problem:

Out of the box, JSON:API does not create a route to retrieve all nodes with any ease. However, there's no need to fear. This post is here!

Now if you're not using JSON:API Extras, I strongly recommend looking into JSON:API Cross Bundles. It creates a route to all content and will simplify your life. If you are using JSON:API Extras, have I got something for you.

Let's dive in.

Scenario:

You're building a decoupled Drupal site and you want to have a reusable template for landing pages that display all content of a content type. This is easy enough to do in Drupal using Views, but we lose Views when going decoupled so we need another way. How do we accomplish this in a decoupled application using GatsbyJS as the front-end?

Solution:

Strap in folks, this one gets a little bumpy.

Drupal side:

We need to do some work on both sides of the application for this to work. First, we will set up a content type in Drupal to use for our Content Landing Pages. This is kind of a choose-your-own-adventure scenario, but one thing that you absolutely must have is an Entity Reference field that references the Config Entity: Content Type with a cardinality of 1.

Select this field type, with Content type as the entity type to reference.

Now that we have our field created, select any content type that will require a content landing page as an available option.

Whatever else you want to put on this page is up to you. Go wild. Want a hero image? Add a hero image. Want a dancing baby gif? That's your choice and I respect it. Once you've finished making the greatest landing page content type ever™, we can move on to the fun part in GatsbyJS.

GatsbyJS side:

JSON:API gives us something that can help us out a bit. It goes by the name allNodeTypeNodeType and it will let us work around the lack of a base all-content route. If we explore this in GraphiQL we'll see that we can drill down a bit and get exactly what we need.

{
  allNodeTypeNodeType {
    nodes {
      relationships {
        node__article
        node__landing_page
        node__basic_page
      }
    }
  }
}

NOTE: I didn't drill down too far, but all fields are available here.

Let's first create our landing page template in Gatsby.

First, let's just create a simple, empty component for our Landing Page with a basic graphql query.

// src/components/templates/LandingPage/index.js
import React from 'react'
import { graphql } from 'gatsby'

function LandingPage({ data }) {
  return (
    // Nothing here yet; we'll flesh this out below.
    <div />
  )
}

export default LandingPage

export const query = graphql`
  query($LandingPageID: String!) {
    nodeLandingPage(id: { eq: $LandingPageID }) {
      id
      title
    }
  }
`

Nothing too fancy here, right? Just a template that we can use to build our pages out without Gatsby yelling at us on build.

Next, we're going to add this template to gatsby-node.js so that we create our pages dynamically.

// gatsby-node.js
exports.createPages = async ({ graphql, actions }) => {
  const { createPage } = actions
  const LandingPageTemplate = require.resolve(`./src/components/templates/LandingPage/index.js`)

  const LandingPage = await graphql(`{
    allNodeLandingPage {
      nodes {
        id
        path {
          alias
        }
      }
    }
  }`)

  LandingPage.data.allNodeLandingPage.nodes.map(node => {
    createPage({
      path: node.path.alias,
      component: LandingPageTemplate,
      context: {
        LandingPageID: node.id,
      },
    })
  })
}

This is pretty straightforward so far, right? Let's think about what we're going to need in order for this to work the way we want it to and pull all of a single content type into our landing page.

We're going to need:

  • The content type we want to build a landing page for.
  • A query to fetch all content of a content type.
  • The landing page template with logic to display the content type.
  • Probably some other things, but we'll sort that out along the way. We're in this together, remember?

How are we going to get these things? Let's go down the list.

The content type we want to build a landing page from:

We have this from our Drupal side. Remember, we created the Content Type field on our Landing Page content type? This can be placed in our gatsby-node.js and passed to our query via the context option.

Let's add it in.  First we need to update our graphql query to pull it in:

// gatsby-node.js
const LandingPage = await graphql(`{
  allNodeLandingPage {
    nodes {
      id
      relationships {           # <---- add from this line
        field_content_type {
          drupal_internal__type
          name
        }
      }                         # <---- to this line
    }
  }
}`)

What we're doing here is looking at GraphiQL and exploring our data to see what we have available. If we drill down into allNodeLandingPage.nodes we can see that in relationships we have field_content_type with some useful things. Specifically, our drupal_internal__type and name values. Also, notice that we removed nodes.path.alias from the query.

By adding these to our query we can now pass the info through to our created pages. We're going to do a bit of data manipulation here to create our paths dynamically as well. I follow the convention that a landing page's path should reflect the content type that it's a landing page for. So, if we were making a landing page for "Articles" the path would be path-to-my.site/articles and articles would have that as a base path to path-to-my.site/articles/my-awesome-article. However, you can follow whatever convention you see fit.

To do this, we're going to manipulate the name from the content type into a URL-friendly string by using the JavaScript .replace() function and then pass that to the path option. Since we also want to query for the content type on our landing page, we're going to pass the drupal_internal__type through the context option.
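For example, a content type named "Basic Page" comes out as basic-page:

const name = 'Basic Page'
const pathName = name.toLowerCase().replace(/ /g, '-')
console.log(pathName) // 'basic-page'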

Let's do that:

// gatsby-node.js
LandingPage.data.allNodeLandingPage.nodes.map(node => {
  const pathName = node.relationships.field_content_type.name.toLowerCase().replace(/ /g, '-') // <---- New line

  createPage({
    path: pathName, // <--- Changed line
    component: LandingPageTemplate,
    context: {
      LandingPageID: node.id,
      ContentType: node.relationships.field_content_type.drupal_internal__type, // <---- New line
    }
  })
})

What does the context option do? It passes data to our component as props. GraphQL already pulls the context data for the queries, which you can see in any query that has a variable for the filter. Usually this is the content ID so that it can build a page for a specific piece of content from Drupal, but we can leverage this to add more variables and more filtering however we see fit.

Our next step is going to be to actually USE this additional info to do something amazing.

A query to fetch all content of a content type:

Let's look back at our src/components/templates/LandingPage/index.js and see what we need to query. We know we want to get all nodes of a certain content type, and we know that we want to reuse this template for any landing pages with content listings. Since we've established that allNodeTypeNodeType gives us access to all content available to Gatsby, let's query on that.

// src/components/templates/LandingPage/index.js
export const query = graphql`
  query($LandingPageID: String!, $ContentType: String!) {
    nodeLandingPage(id: { eq: $LandingPageID }) {
      id
      title
    }
    allNodeTypeNodeType(filter: { drupal_internal__type: { eq: $ContentType } }) {  # <---- New section
      nodes {
        relationships {
          node__article {
            id
            title
            path {
              alias
            }
          }
          node__page {
            id
            title
            path {
              alias
            }
          }
        }
      }
    }
  }
`

What we're doing here is using that variable we passed via the context option in gatsby-node.js and filtering to only return the content type we're wanting to see. One 'gotcha' here is that this query will also return the landing page that references the content type. However, if you're not creating a landing page of landing pages then you should be alright.

Since we're only creating landing pages for two content types, this is fine, although we're not getting a lot back. Most projects that I've worked on have had some kind of "teaser" display for these kinds of pages. I'm not going to cover the specifics of creating a teaser template here, but the TL;DR is: start with your full display and take out everything but what you want on the teaser. For this post, we're going to create the list of links using the titles.
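If you do go the teaser route, a minimal sketch could be as simple as a linked title that you then grow field by field (the component name and props below are hypothetical, not something from this project):

// src/components/Teaser/index.js (hypothetical sketch)
import React from 'react'
import { Link } from 'gatsby'

// A stripped-down version of the full node display: just a linked title.
// Add back whichever fields (image, summary, date) your design calls for.
function Teaser({ title, path }) {
  return <Link to={path}>{title}</Link>
}

export default Teaser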

Now, if the content types that we're creating landing pages for don't have any content, then you're going to have a bad time. In this case, go back to my previous post about empty entity reference fields and see if you can use that to create some default fields and prevent errors or just create some content of the missing type.

Next, let's flesh out our landing page template a bit.

The landing page template with logic to display the content type:

So far, our template, minus the query, is pretty empty and not doing a lot. Let's add in the title of this landing page.

// src/components/templates/LandingPage/index.js
function LandingPage({ data }) {
  const landingPage = data.nodeLandingPage

  return (
    <h1>{landingPage.title}</h1>
  )
}

I like to clean up the variables a bit and rename data.nodeLandingPage to landingPage. It's a bit cleaner to me, but do what you want.

Alright, we have the title of this content, but what about the list of content we want to show on this page? Well, we're going to need to do some logic for that. First off, we need to know which content type we're looking for. Second, we need a way to find it. Third, we need to clean this data into something usable. Finally, we need to display it.

We could just display everything returned from our allNodeTypeNodeType query, but there would be a lot of nulls and issues parsing the arrays. Here's an example of what that query returns before we massage the data, using the Drupal internal type article:

{ "data": { "allNodeTypeNodeType": { "nodes": [ { "drupal_internal__type": "article", "relationships": { "node__article": [ { "id": "0e68ac03-8ff2-54c1-9747-3082a565bba6", "title": "Article Template", "path": { "alias": "/article/article-template" } } ], "node__basic_page": null } } ] } } }

Now, to get the content this way we could do some complex mapping and sorting and filtering, but I tried that and it wasn't fun. Fortunately, Gatsby is here to rescue us and make life easier. Our context option gets passed into our page component as props. If you're unfamiliar with the concept of Props in React, and therefore Gatsby, props are properties that are passed into components. The line 

function LandingPage({ data }) {

could be rewritten as

function LandingPage(props) {
  const data = props.data

but we're using a concept called Destructuring to only pass in the prop that we need. This allows us to create variables from object keys without having to take the extra steps. Our page component's props object also contains the key pageContext, which is where anything passed in the context option gets stored, giving the page template access to it.

Let's bring that in:

// src/components/templates/LandingPage/index.js
function LandingPage({ data, pageContext }) {
  const landingPage = data.nodeLandingPage
  const nodeType = data.allNodeTypeNodeType
  const contentType = 'node__' + pageContext.ContentType

Since we set our ContentType in gatsby-node.js we're able to use that here. Note that we're concatenating the string node__ with our pageContext.ContentType. We're doing this because everything in Gatsby is a node, including content types. This allows us to do the next steps.

Next, we want to clear out all of the non-content type data from the allNodeTypeNodeType query. This is what it looks like if we were to console.log(nodeType.nodes):

Array(1)
  0:
    relationships:
      node__article: Array(1)
        0: {id: "0e68ac03-8ff2-54c1-9747-3082a565bba6", title: "Article Template", path: {…}, …}
        length: 1
        __proto__: Array(0)
      node__page: null

We only want the node__article array, so how do we get that? Well, we need to use .map() and a concept called currying. This is essentially creating a function that allows us to use a variable from outside of the .map() scope inside of the .map() callback. It allows us to break down a function into more functions so that we have more control over it, which is what we need here.
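If currying is new to you, here's a tiny standalone example (purely illustrative, nothing to do with the landing page code): a function that takes its arguments one at a time and returns a new function at each step.

// A curried add: add(2) returns a function that adds 2 to whatever it receives.
const add = (a) => (b) => a + b

const addTwo = add(2)
console.log(addTwo(3))               // 5
console.log([1, 2, 3].map(add(10)))  // [11, 12, 13]

That last line is the exact shape we need: the outer call captures a value from outside .map(), and the function it returns becomes the .map() callback. Back in our template: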

// src/components/templates/LandingPage/index.js
function LandingPage({ data, pageContext }) {
  const landingPage = data.nodeLandingPage
  const nodeType = data.allNodeTypeNodeType
  const contentType = 'node__' + pageContext.ContentType

  const getContentArray = (contentType) => { // <---- Curry function, but not as delicious
    return (node) => (node.relationships[contentType])
  }

  const contentArray = nodeType.nodes.map(getContentArray(contentType))

We created our curry function that takes our contentType as an argument. From within there, it completes the mapping and returns our desired array... almost.

Here's what we get back if we console.log(contentArray):

[Array(1)]
  0: Array(1)
    0: {id: "0e68ac03-8ff2-54c1-9747-3082a565bba6", title: "Article Template", …}
    length: 1
    __proto__: Array(0)
  length: 1
  __proto__: Array(0)

We're almost there, but now we have an array of our content within another array. If only there were a function to help us out here...

Just kidding, there is! For this, we're going to use .flat(). The .flat() function flattens a nested array into a single level. However, there's a gotcha with it, as mentioned in this Stack Overflow question. We can get around this by using the gatsby-plugin-polyfill-io plugin.
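As a quick aside, here's what .flat() does to a nested array like ours:

const nested = [[{ title: 'Article Template' }], [{ title: 'Another Article' }]]
console.log(nested.flat())
// [ { title: 'Article Template' }, { title: 'Another Article' } ]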

Add gatsby-plugin-polyfill-io to your project by installing with yarn or npm

npm install gatsby-plugin-polyfill-io
# or
yarn add gatsby-plugin-polyfill-io

and in your gatsby-config.js file add the following within module.exports = {

plugins: [
  {
    resolve: `gatsby-plugin-polyfill-io`,
    options: {
      features: [`Array.prototype.map`, `Array.prototype.flat`]
    },
  },
]

This will also create polyfills for the .map() function, which I use heavily.

So, let's flatten that array!

function LandingPage({ data, pageContext }) {
  const landingPage = data.nodeLandingPage
  const nodeType = data.allNodeTypeNodeType
  const contentType = 'node__' + pageContext.ContentType

  const getContentArray = (contentType) => {
    return (node) => (node.relationships[contentType])
  }

  const contentArray = nodeType.nodes.map(getContentArray(contentType))
  const contentArrayFlat = contentArray.flat()

And the resulting console.log(contentArrayFlat):

0: id: "0e68ac03-8ff2-54c1-9747-3082a565bba6" path: {alias: "/article/article-template"} title: "Article Template" length: 1 __proto__: Array(0)

Now we've got exactly what we wanted! The final step is to put this to work. We'll do that by creating a list of titles that link to the content. Your finished component should look like:

// src/components/templates/LandingPage/index.js
import React from 'react'
import { graphql, Link } from 'gatsby' // <--- added 'Link' here to use the link component

function LandingPage({ data, pageContext }) {
  const landingPage = data.nodeLandingPage
  const nodeType = data.allNodeTypeNodeType
  const contentType = 'node__' + pageContext.ContentType

  const getContentArray = (contentType) => {
    return (node) => (node.relationships[contentType])
  }

  const contentArray = nodeType.nodes.map(getContentArray(contentType))
  const contentArrayFlat = contentArray.flat()

  return (
    <>
      <h1>{landingPage.title}</h1>
      <ul>
        {/* One-liner to create the list of items. */}
        {contentArrayFlat.map((item, i) => (
          <li key={i}>
            <Link to={item.path.alias}>{item.title}</Link>
          </li>
        ))}
      </ul>
    </>
  )
}

export default LandingPage

export const query = graphql`
  query($LandingPageID: String!, $ContentType: String!) {
    nodeLandingPage(id: { eq: $LandingPageID }) {
      id
      title
    }
    allNodeTypeNodeType(filter: { drupal_internal__type: { eq: $ContentType } }) {
      nodes {
        relationships {
          node__article {
            id
            title
            path {
              alias
            }
          }
          node__page {
            id
            title
            path {
              alias
            }
          }
        }
      }
    }
  }
`

And that's all there is to it. Hopefully you find this useful and it helps speed up your development with Gatsby a little bit.  If I missed anything on here, please don't hesitate to let me know in the comments. Always feel free to reach out to me on Twitter or Slack or any way you want to.

Credit where credit is due: Shane Thomas (AKA @codekarate) and Brian Perry (AKA @bricomedy) helped me work through this issue at NEDCamp.

Patron thanks:

Thank you to my Patreon Supporters

  • David Needham
  • Tara King
  • Lullabot

For helping make this post. If you'd like to help support me on Patreon, check out my page https://www.patreon.com/jddoesthings

Category Development Tags Drupal Planet GatsbyJS react Drupal Comments

This post covers the steps that any new launch or high traffic event should go through in order to have the best chance of success. It is aimed at the project management level, so it will try to stay out of the weeds and focus on the high level topics you need to think about. There is a ~18 minute recording at the end of this post where I presented this topic at Drupalsouth 2019.

Title slide from the presentation.

Preamble: What could be considered a high traffic event

  • Launching a brand new site
  • Re-platforming (e.g. moving CMS version or type, or between hosting providers)
  • eDM or other marketing event (e.g. Adwords)
  • Planned traffic event (e.g. Black Friday)
  • Unplanned traffic event (e.g. news and media site)
Step 1) Ensure you have some basic Drupal configuration in place
  • Disable known problem child modules dblog, devel, statistics, radioactivity, page_cache
  • Enable dynamic_page_cache (if you have authenticated traffic)
  • Set minimum cache lifetime to something sensible
  • JS and CSS aggregation enabled
  • Automate these checks with Drutiny
Step 2) Content Delivery Network (CDN)

Additional insurance against a lot of traffic is distributing your cached content to all corners of the globe.

Tiered caching should be used to ensure the highest offload rate. Most CDN providers will support this at a given price point.

Step 3) Cache tuning and minimising origin requests

Every request that bypasses your CDN layer adds load to the platform. In order to have the best chance of surviving a high traffic event, origin traffic needs to be carefully considered and reduced where possible.

Requests to origin that are often overlooked

  • 404s
  • Marketing based parameters (e.g. utm_campaign)
  • Redirects (especially if re-platforming)
  • WAF to block silly requests (e.g. Wordpress URLs like wp-login.php)

If you are interested in WAF tuning, you should check out my talk last year on using Cloudflare to secure your Drupal site.

This has happened to a customer of mine in the past. Fun fact: the gclid and dclid query parameters are guaranteed unique for every user and click. This effectively makes them un-cacheable.

Step 4) Load testing

If you are building a new site, or are expecting a substantially different traffic profile than what you have currently, then you should look to load test the system.

  • Production hardware replica (scaled up if appropriate)
  • Emulate expected user behaviour, using existing analytics or expected flows (see the sketch after this list)
  • Emulate what the browser would be doing (download all assets, including any HTTP 404s)
  • Ensure complex tasks are also simulated at the same time (e.g. editorial, searching, form submissions, feeds ingestions)
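The talk doesn't prescribe a particular tool, but as one possible sketch (using k6, with made-up URLs and numbers), a minimal script emulating a simple browse flow might look like this:

// loadtest.js — a minimal k6 sketch (run with: k6 run loadtest.js)
import http from 'k6/http'
import { sleep } from 'k6'

export const options = {
  vus: 50,          // 50 concurrent virtual users (made-up number)
  duration: '5m',   // sustain the load for five minutes
}

export default function () {
  // Hit the homepage, then a listing page, pausing roughly like a real user would.
  http.get('https://www.example.com/')
  sleep(2)
  http.get('https://www.example.com/articles')
  sleep(3)
}

A real test should also pull down static assets and exercise the complex tasks listed above, but this is the basic shape to build on.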

At the end of this task (which you may need to run several times), you should have the confidence that you can handle the traffic expected.

Step 5) Hardware (auto) scaling

Now that you have the hardware you need to have in place with load testing, ensure you have autoscaling in place to deal with the peaks and troughs (it is unlikely you need to run your peak hardware for the entire duration of the event).

Autoscaling can also help if the origin traffic that you experience is higher than anticipated.

Test the autoscaler, set limits that you are comfortable with, and ensure you know how quickly the new resources take to come to life.

Be nice to your OPs team, use an auto scaler.

Step 6) Have a good fallback

Say the worst does happen, and your site does go down, or a critical API drops off the face of the internet, what does the end user see? Can you offer at least a better experience than a generic web server error page?

Most CDNs will have the ability to load balance origins (hot DR), and even fallback to a static version of the site if all origins are down.

It would make sense to test this prior to the high load event as well.

ABC's news website went down just before Drupalsouth 2019, and someone managed to screencap it and then send it to me. I am confident that you can come up with a better fallback than this error page.

Step 7) Warm your cache

If you have a rather long-tail website, it will be worth warming your cache prior to the event. An excellent module called warmer has been written, which allows warming all sorts of caches. It can, for instance, load every page in the XML sitemap, so this is fairly low effort, high reward.
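If you would rather not add a module, even a throwaway script that walks the XML sitemap will do the job. Here's a minimal sketch (Node 18+ for the global fetch, with a made-up sitemap URL); the warmer module remains the more robust option:

// warm-cache.js — naive sitemap-based cache warmer
const SITEMAP_URL = 'https://www.example.com/sitemap.xml' // made-up URL

async function warm() {
  const xml = await (await fetch(SITEMAP_URL)).text()
  // Pull every <loc> entry out of the sitemap with a simple regex.
  const urls = [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((match) => match[1])
  for (const url of urls) {
    const res = await fetch(url) // a GET is enough to populate the page and CDN caches
    console.log(res.status, url)
  }
}

warm().catch((err) => {
  console.error(err)
  process.exit(1)
})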

Step 8) Third party API dependencies

This is more of a fundamental design decision likely made much earlier on in the project. Say the content of your page is dependent on the content in an API response.  If you request the API content during page generation time, then you are tying the speed and availability of your site to another site (often outside your control).

This can lead to slow page load times and, in the worst case scenario, can tie up your server's resources.

New Relic APM has "external requests", which allow you to visualise this.

There are ways to mitigate this:

  • Fetch the data in the background and cache locally in Drupal for as long as the data is considered 'good'. e.g. using Drush and a cronjob.
  • Use a client-side application (e.g. React) and request the API response from the client side (see the sketch after this list)
  • Use a CDN on the API and see Step #6 above
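As a rough sketch of that second option (the endpoint and component below are hypothetical), the page renders immediately and the API data fills in when it arrives, so a slow or unavailable API no longer blocks page generation:

// A hypothetical React widget that fetches API data client side.
import React, { useEffect, useState } from 'react'

function PriceTicker() {
  const [data, setData] = useState(null)

  useEffect(() => {
    // The page has already rendered; a slow or down API only affects this widget.
    fetch('https://api.example.com/prices') // made-up endpoint
      .then((res) => res.json())
      .then(setData)
      .catch(() => setData(null))
  }, [])

  if (!data) {
    return <p>Loading prices…</p>
  }

  return <p>Latest price: {data.price}</p>
}

export default PriceTicker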
Step 9) Realtime analytics

During the event, having access to realtime (or near realtime) analytics to find out

  • how the system is currently performing
  • requests/sec
  • where the traffic is coming from
  • cache offload rate from the CDN

is extremely valuable. Even more valuable is being able to respond to this data in a quick and efficient manner. Having access to technical people can help. The types of logs and analytics you should be looking to get a hold of:

  • Web analytics tools (e.g. Google Analytics)
  • APM tools (e.g. New Relic)
  • CDN analytics (e.g. Cloudflare Logs)
  • Log stream from hosting provider (e.g. PHP error log)

To see where you can take this, you might also be interested in reading this blog post that shows off some dashboards that were purpose built for a high traffic event.

An example dashboard that was written for a previous high traffic event that I was involved with. The data is around 6 minutes delayed, but still proved invaluable.

Step 10) Application changes in a pinch

If you do spot something in your analytics, it is worth knowing what tools you have at your disposal to mitigate issues quickly and easily.

  • Cloudflare page rules (redirect a broken path, increase the WAF presence on a route)
  • Nginx or Apache configuration
  • Application hotfix (avoid clearing the cache)

Knowing which tool will solve which problem, how long each option takes to deploy, how safe it is, and how easy the rollback is, is absolutely critical.

Cloudflare's page rules feature is an excellent way to make quick changes to how your application functions.

Step 11) Letting your hosting provider and their support team know

No-one likes surprises, so plan ahead. Ensure there are people available or on call during your traffic event. This goes for your hosting provider, your CDN provider, and your support staff.

Postamble: What success looks like

So after your high traffic event has ended, here are some simple things to check in order to see how successful you were:

  • Minimal origin requests and a high CDN offload
  • Boring origin hardware graphs
  • No rants on twitter
  • No trending hashtag on twitter that is negative
  • Users remember the event for its content, and not the problems with it
Drupalsouth 2019 video

This was me presenting this topic at Drupalsouth 2019.

Let me know in the comments if this was of use, and also if you have any other words of wisdom for anyone else.

A Step-by-step guide to integrating your BigCommerce store with the Drupal CMS


The BigCommerce for Drupal module, created by Acro Media in partnership with BigCommerce, was released early this year and brings together two different platforms – BigCommerce, the open SaaS ecommerce platform, and Drupal, the open source content management system. The result provides a wonderful new way for retailers to implement an innovative and content rich headless ecommerce strategy. If you use one and would like to have the capabilities of the other, the BigCommerce for Drupal module is the bridge you need. With this module, you can use Drupal as the powerful front-end CMS with BigCommerce as the easy-to-use and scalable ecommerce backend.

This post is a step-by-step guide for people who want to know how to install the BigCommerce for Drupal module and get started with both platforms. If you just want to know more about BigCommerce and Drupal together as an ecommerce solution, check out this post instead.

How this module works

Here’s a quick overview of how this all works. The BigCommerce for Drupal module integrates BigCommerce and Drupal together, but each platform is still used for different tasks.

In BigCommerce, you configure products, categories, shipping, taxes and everything else for the ecommerce side of your site. BigCommerce is also where you go to manage orders as they come in.

Drupal is then used for the website frontend and theming. Product and category information from BigCommerce is synced to Drupal, importing it as Drupal Commerce products so that it can be displayed and used like any other Drupal-based content. Any non-commerce content is also managed within Drupal. When a customer goes to checkout, a BigCommerce checkout pane is embedded in the Drupal site to securely process payment and save customer and order information.

Set up BigCommerce and Drupal

On to the guide! Follow these steps and you’ll have your BigCommerce and Drupal store configured in no time!

Prerequisites

This guide already assumes that you have the following ready.

  1. A BigCommerce account and store created
    You will need to create a BigCommerce account with at least one product, shipping method and payment method configured in your BigCommerce store. Do this here, not in Drupal.

    NOTE: BigCommerce currently offers a 14-day trial period, so anyone can go and create and configure a store easily for free. For this demo, I signed up for that and created some random products to use for testing.

  2. A working Drupal 8 site
    You should have a Drupal 8 site with the Commerce module enabled and a default store added (via Commerce > Configuration > Store > Stores). You don’t need to do any other setup here yet or enable any of the other Commerce modules like checkout or payment. BigCommerce is going to handle all of this for you.

  3. An SSL certificate for your Drupal site
    Your Drupal website needs to have an SSL certificate active for the BigCommerce checkout form to render. This is required because it ensures security for your customers at checkout, so make sure you install one.
BigCommerce for Drupal setup guide

With the prerequisites done, here’s what you need to do to get the BigCommerce for Drupal connection made.

Step 1: Create a BigCommerce API account
  1. Go to your BigCommerce store admin page and navigate to Advanced Settings > API Accounts.

  2. Click on “Create API Account” button and select “Create V3/V2 API Token”.


    Fig: BigCommerce Store API Accounts page

  3. Provide a name (e.g. Product Sync) and select the scope for each feature (e.g. if you don’t want the Drupal admin to be able to modify product information, you can set the scope for “Products” to “read-only”).


    Fig: API configuration in BigCommerce

  4. Click “Save” to save your changes. Once saved, you will see a summary and a prompt to download a file. Download it and keep it safe. Once you create an API account, you can’t modify the keys (but you can always make a new one).


    Fig: BigCommerce API Credentials dialog box
Step 2: Download and configure the BigCommerce for Drupal module
  1. Get and install the BigCommerce for Drupal module.

    TIP: This module requires a bunch of other modules to work. To get the BigCommerce for Drupal module and all of its dependencies at the same time it’s recommended to use Composer instead of manually downloading it. Running the following command within your Composer based Drupal project will get everything you need.

    composer require drupal/bigcommerce
  2. In Drupal, navigate to module configuration page at Commerce > Configuration > BigCommerce > BigCommerce Settings.
    1. Fill in the API Path, Client ID, Secret Key, and Access Token that you received when creating the BigCommerce API.

    2. Hit “Save”. If everything is correct, you will see a message saying “Connected Successfully”.


      Fig: BigCommerce Configuration page in Drupal site
  3. Next we configure the Channel Settings. This will create a storefront url for you in BigCommerce which will match the one that is generated on the Drupal side.

    1. Select “Add new channel” from the select channel list.

    2. Provide a channel name.

    3. Click the “Create new BigCommerce channel” button. You will then see a Site ID and Site URL on the setting page.


      Fig: BigCommerce configuration page in Drupal
  4. Now in the same Channel Settings area, click on the “Update BigCommerce Site URL” button. This confirms that the generated URL is actually sent to BigCommerce; otherwise, the checkout form will not be loaded on your Drupal site.

    You can also confirm the channel connection from within the BigCommerce admin dashboard by visiting the Channel Manager admin page.


    Fig: Channel Manager storefront confirmation in BigCommerce
Step 3 : Sync products, variations and taxonomies from BigCommerce
  1. In Drupal, navigate to the product synchronization page at Commerce > Configuration > BigCommerce > BigCommerce Product Synchronization.
  2. Click the “Sync Products from BigCommerce” button and ta-da, all the products, variations, and categories will be synced to your Drupal site in an instant.

    Alternatively, you can also synchronize via the following Drush command. Advanced Drupal users can use this command on cron to do automatic syncing.

    drush migrate:import --group bigcommerce
    Fig: Product Synchronization page


    Fig: Syncing from BigCommerce in progress

    NOTE: If you run into errors when syncing products, it’s probably because you don’t have a store added in the Drupal Commerce module yet. Add one at Commerce > Configuration > Store > Stores.

    TIP: Any time you make changes to the products in BigCommerce, visit this page or use the Drush command to synchronize the changes. Before syncing, you’ll also see a message telling you that updates are available.

  3. Confirm the products have synced by visiting the Product page for Drupal Commerce at Commerce > Products. A list of all of the products brought in from BigCommerce will appear here.
Step 4 : See the BigCommerce checkout in action
  1. Now that everything is set up, go to a product page, add it to your cart, and proceed to checkout.

    If everything was done correctly, you will be able to see the BigCommerce checkout form embedded in to your Drupal site! Hurray! All of the shipping methods, payment methods, tax calculations, and other BigCommerce store configurations will be seen in the embedded form here.

    If you don’t see the checkout form, make sure that your channel settings are correct and that you have an SSL certificate installed.


    Fig: Drupal’s checkout page with embedded BigCommerce checkout form


    Fig: Drupal’s checkout page after order complete

  2. Once an order has been placed, the order information will be stored in Drupal (at Commerce > Orders) and will also be sent to BigCommerce (at Orders > View).


    Fig: BigCommerce backend View Orders page
Additional notes

The BigCommerce for Drupal module is ready for production and available for all to use. When writing this guide, there were some additional notes that I wanted to share.

  • At this time, product management should always be handled within BigCommerce and then synced to Drupal. Currently there is no option to bring back a product if you delete it on the Drupal side, so be careful.
  • A development roadmap for the module can be found here. It outlines future features and plans.
  • If you use the module and find any bugs or want specific features, please add them to the module issue queue here.
Acro Media is a BigCommerce Elite Partner

Acro Media is the development team partnered with BigCommerce that made the BigCommerce for Drupal module a reality. We have many, many years of ecommerce consulting and development experience available to support your team too. If you’re interested in exploring Drupal, BigCommerce or both for your online store, we’d love to talk.

So let us explain in this post why we think it’s best to create an e-commerce website with Drupal.

With the Views module, it is possible to fetch pieces of data from a Drupal entity and display them according to a specific format. The Views module acts as a query builder, which generates the SQL code, in charge of retrieving the data from the database.

The first setting when creating a view allows you to choose the base table from which the aforementioned data will be recovered.

When you choose Content as the base table, you also have to specify the Content type, so it will not be possible to retrieve data from other content types and present it within the view unless you set a Relationship between those content types.

This tutorial will explain the concept of Relationships in Views with a basic example. 

Let’s start!

How these Drupal SEO Modules (and tips) can boost your Website Ranking Shefali Shetty 03 Dec, 2019

With our Drupal SEO guide, you don’t have to go too far in search of SEO tips, features and modules to boost your Drupal website ranking. Read and bookmark this for future reference!

We all know Drupal as this robust, flexible and dependable CMS platform but not many realize its abilities when it comes to SEO.
SEO can be one of the most influential factors for the success of an organization or a business. Did you know that 93% of online experiences start with search engines and 51% of all website traffic is attributed to organic searches (source)?! 
When people from around the globe search for your product or services, you want to appear as high as possible on the search engine ranking. Improving your SERP (Search Engine Results Page) ranking is now more important than any other marketing strategy. Ideally, Drupal SEO is about making your website easy for both your visitors and the search engine crawlers to understand. Thus, your SEO strategy begins long before your Drupal website is built.
With an effective long-term perspective, its amazing collection of modules, its flexibility and customization options and not to forget, the loyal community working towards the betterment of the open source platform, Drupal 8 tops the CMS chart. Drupal 8 offers some exemplary SEO modules that can boost your Drupal 8 SEO and SERP ranking.

Lay your Foundation

With Drupal 8, most of the required SEO best practices are already embedded into the core of the platform and with a little knowledge of SEO and some must-have configurations, anybody can possibly boost their website’s SEO to drive more traffic. But before you jump right into the tools and other configurations on Drupal 8, sit back and think about the basics which you need to get right for your website's success.

Things to remember -
  • With your website goals in sight and your visitors' interests in mind, sit down and form a proper pre-SEO strategy to begin with. 
  • Know your audience before you begin with your Drupal 8 SEO strategy. Once your audience set is established, shooting targeted content will be easier.
  • Pay attention to every detail that goes into your On-page SEO (content, improve keyword ranking), Off-page SEO (backlinks) and Technical SEO (site architecture, UX, clean code). 
  • Website performance and site speed directly contribute to a website’s SEO ranking factor. This is not just true because of Google’s algorithm update for speed but also because a slow loading website hampers user experience.
  • Come 2019, Google has changed the way SEO used to work. The focus is now more on the value you bring to your users – mainly through content. Hence, getting your content strategy right is crucial for a good website ranking.
  • Google favors websites that make it easy for users to navigate and access irrespective of the device. With Drupal 8, you can be assured of a responsive website because of its Mobile-first approach. 
  • Remember that SEO works differently for different businesses. There is no one-size-fits-all when it comes to crafting an SEO strategy for your Drupal website.
Drupal SEO Modules

With Drupal 8 and the continuous innovation approach, it has adopted some of the best-in-class technologies thus making it more future-proof. More importantly, it has made content authoring more powerful and easier at the same time. One of the great things about Drupal 8 is that it is "SEO ready" right out of the box. To begin with, let us talk about some important Drupal SEO modules that help enhance your SEO efforts on your Drupal website.

1. Drupal SEO Checklist Module

If you are a ‘To-do-list’ person who is in love with organizing things, this module is for you. This is an important module which does not directly affect your Drupal SEO or improve the SERP, but it provides critical information on the changes to be made on the site. The Drupal SEO Checklist module checks your entire Drupal website for proper search engine optimization against SEO best practices and tells you what to do. It provides a detailed report on what needs to be done to improve the performance of your Drupal website.


          Drupal SEO Checklist

It keeps track of how tasks have been taken care of, what has already been done (with timestamps) and what needs to be attended to. If a task needs you to install a module, it provides you with a link to download it as well. This data provides a report that can be used for further monitoring.

2. Pathauto Module

Using clean URLs that indicate what content they represent is extremely important for SEO. One of the most important and useful modules for Drupal SEO, Pathauto plays a major role in creating SEO-friendly URLs on your website. The usual "example[dot]com/node/1" can be replaced with more SEO-specific URL aliases such as "example[dot]com/page/keyword". Based on the category of your website page or based on the page title, with Pathauto you can build URLs which are SEO friendly. These intuitive URLs make it easy for visitors to your site to understand what they are looking at and where they are, which helps improve your site ranking on search engines.
 


         Pathauto Module

3. Google Analytics Module

In our recent post, we discussed how you can add the Google Analytics module to your Drupal website and also create custom reports for better performance of your website. Though the Google Analytics module does not have any direct effect on your Drupal SEO or improve your ranking, it plays a major role in providing the necessary information that can amplify your Drupal 8 website's SEO success. By tracking your visitors, their behavior and interests within your site, you can change or add new strategies to drive more traffic and increase conversions. You can also use the GA Reports module, which can provide you with graphical representations and detailed reports about your Drupal website.


        Google Analytics Module

4. Global Redirect Module

Google certainly does not like spammy duplicate content on a web page. Such content can have a negative impact on your SEO efforts and, as a result, can harm your rankings on the search engine. On Drupal, while you are happily creating clean URLs with the alias system, you should note that a small problem arises: with the creation of new URL aliases, the default URLs still exist, and search engines do not see that as a good sign.

         Global Redirect module

The Global Redirect module helps rectify this problem by verifying that an alias exists for a URL and redirecting to it. The module also plays a role in checking the URL implementation and the permission or access required for the nodes and URLs. However, this module has been deprecated for Drupal 8 and its functionalities are now merged into the Redirect module.

5. SEO Compliance Checker Module

A module which is of great help to Drupal SEO beginners and webmasters, SEO Compliance Checker performs a complete check when a node is created or modified on your Drupal site. The checks include whether titles and meta tags are optimized, whether any alt tags are missing, keyword usage, and other important factors for better SEO and improved keyword ranking.


           SEO Compliance Checker Module

While the core module, seo_checker, does not perform any of these checks itself, the submodules that come along with it (basic_seo_rules.module and keyword_rules.module) execute the checks that implement some basic SEO rules. The core module, on the other hand, gathers the required information about the checks to be performed and applies them to collect the results.
In addition to these modules, there are many other Drupal 8 SEO modules, such as the Page Title module which allows the page title to be set, the Metatag module which gives you complete control of meta tags on your Drupal website, and the XML Sitemap module which creates dynamic, search-engine-readable sitemaps.

Some Additional Tips

Just to let you know, in the time you took to read the blog until this point,

  • More than a million Google searches were made.
  • Close to 20,000 Facebook posts were posted.
  • More than 1000 blogs were posted on the internet.

The internet is noisier than ever now, and it doesn't seem like it is going to stop anytime soon. SEO in its beginning stages was all about cramming your website with keywords and letting the crawlers do their magic. But over time, SEO has blossomed, with Google introducing some amazing algorithms and updates to curb black hat SEO practices.
Right now, for a successful Drupal 8 SEO campaign, you need to study your visitors, their behavior and interests on your website and curate the content accordingly to stay ahead of the curve. You don't want to be in the midst of a content gap which can create problems for your website's conversions. With this study, you have a real opportunity in hand to develop fresh content for your Drupal website and optimize it for the searches on the search engine.
 

With Drupal, using the right SEO modules can take your website to the top of the search engine rankings and set you up for success. But not everybody knows how the Google algorithms work or when the team is going to make the next change that could affect your Drupal website. So stick to your basics: avoid duplicate content and keyword stuffing, use human-friendly URLs, and create your website in a way that your visitors find genuinely helpful. Partner with the right Drupal development company who will design your Drupal strategy carefully and use the best Drupal SEO modules for your website.


About the Webform module

The Webform module is a form builder and submission manager for Drupal. The Webform module allows site builders to customize forms and route submissions to multiple systems, including email, remote servers, and spreadsheets.

About the Group module

The Group module allows site builders to create arbitrary collections of content and users on a site and then grant users access to these collections. The Group module allows sites to build communities and manage their organization by creating groups of users with various levels of membership.

Problem

The Webform module for Drupal 8 does not fully integrate with the Group module. In Drupal 8, webforms are configuration entities; however, the Group module currently only supports content entities. To learn more about this issue, see Issue #2856333: Webform as group content.

Webforms can be attached to nodes, which are content entities supported by the Group module. This approach only provides access controls to determine which group members can simply view and submit a webform. It does not provide webform submission and element level access controls.

For example, a group could have a dedicated event registration system with a dedicated registration webform. A group administrator can control who has access to the event registration webform but there is no mechanism to allow group-specific roles to view, manage, and delete event registrations.

Solution

Provide webform submission and element level access controls to webforms attached to nodes.

Existing access controls

The Webform module already provides role, user, and permission level access controls to webforms...

Drupal VM 5.1.0 was just released (release name Recognizer), and the main feature is PHP 7.4 support; you can now begin running and testing your Drupal sites under PHP 7.4 to check for any incompatibilities.

PHP 7.4 includes some new features like typed properties, arrow functions, and opcache preloading which could help with certain types of code or site deployments (I'm interested to see if opcache preloading could help the startup time of Drupal inside container environments like Kubernetes!).

What our clients are saying

A great experience and a much improved website.
Thanks so much for everything!
I love directing our customers to our new site knowing that they are going to be able to find exactly what they are looking for...
I realized that I had picked the right company to work with soon after beginning a project with Peerless Design, Inc.
I would highly recommend her for any position requiring IT design and development
...a pleasure to work with, combining patience (for my busy schedule and at times overwhelmed brain) with her strong motivation and energy to keep me going
...continued to monitor it closely and is still always available to help me if I have any questions
...we just want you to know that we are appreciative!
I have seen the first layouts and they are awesome...
...your punctuality, your casual and open personalities, and both your hard copy and online portfolios speak very highly of you and your business as well
...dedicated, competent and driven to get the job done and done well.
... they also made suggestions which showed me that they fully understood what I wanted to accomplish.
...I have no doubt we will have the best site in the 2010 election of any PA candidate
...provided us with excellent, expert service in a professional and personable manner.
" PDI provides us prompt, effective and efficient service in maintaining our Drupal based website."
I'm so happy we chose to work with PEERLESS Design.
... incredibly impressed with what you brought to the table
...can do anything any other designer can do and generally quicker, cheaper and better.
...took my less than mediocre site and completely revamped it into a beautiful, professional, and easy-to-navigate site
...creative, independent, responsive...
...able to take my abstract ideas and add their expertise to bring them to life in a way that was better than I could have imagined!
I had a very tight deadline and budget, and they met it, seemingly with ease.
...able to translate technical information in an accessible way...
...very responsive to our questions and needs