Sitecore indexes are very powerful for fetching various items fast, especially when those items are scattered across different locations. There are also some pitfalls one needs to be aware of. This post covers methods that may be worth considering in some scenarios, and describes how I reduced indexing time from around three hours down to two minutes for a specific scenario.
There is a DLL version conflict for Polly.dll between the Sitecore CMP connector and Sitecore Publishing Service 4.1.0. The Publishing Service module ships with Polly.dll version 5.9.0, while Sitecore Connect for CMP 1.0.0 ships with Polly 6.0.1. This causes the Sitecore CM to stop working during the CMP install if the SPS module is already in the system.
Update: As Sitecore 9.3 and SPS 4.2.0 were released just a few hours after this post was written, I noticed this applies to that version as well. The SPS 9.3 module also comes with Polly 5.9.0.
The easy fix for this is to add an assembly binding redirect to the web.config file before installing the connector and keep the old file. Update: It turned out I had tricked myself. I thought I had everything working, but the CMP connector throws exceptions in the log while importing content from Content Hub.
<dependentAssembly>
  <assemblyIdentity name="Polly" /><!-- publicKeyToken="c8a3ffc3f8f825cc" -->
  <bindingRedirect oldVersion="0.0.0.0-6.0.1" newVersion="6.0.1" />
</dependentAssembly>
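Config edits like this are easy to script when you have many environments to patch. Below is a minimal Python sketch of such a helper; the function name and the sample config are my own invention, not part of any Sitecore tooling, and the version numbers are just the ones from this scenario:

```python
import xml.etree.ElementTree as ET

ASM_NS = "urn:schemas-microsoft-com:asm.v1"
ET.register_namespace("", ASM_NS)  # serialize without a namespace prefix

def add_binding_redirect(config_xml, name, old_range, new_version):
    """Insert a <dependentAssembly> binding redirect into a web.config
    string, replacing any existing redirect for the same assembly."""
    root = ET.fromstring(config_xml)
    binding = root.find(f"./runtime/{{{ASM_NS}}}assemblyBinding")
    # Drop any existing redirect for this assembly, then add the new one.
    for dep in list(binding):
        ident = dep.find(f"{{{ASM_NS}}}assemblyIdentity")
        if ident is not None and ident.get("name") == name:
            binding.remove(dep)
    dep = ET.SubElement(binding, f"{{{ASM_NS}}}dependentAssembly")
    ET.SubElement(dep, f"{{{ASM_NS}}}assemblyIdentity", name=name)
    ET.SubElement(dep, f"{{{ASM_NS}}}bindingRedirect",
                  oldVersion=old_range, newVersion=new_version)
    return ET.tostring(root, encoding="unicode")

sample = ('<configuration><runtime>'
          '<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1"/>'
          '</runtime></configuration>')
print(add_binding_redirect(sample, "Polly", "0.0.0.0-6.0.1", "6.0.1"))
```

The sketch assumes the `runtime/assemblyBinding` section already exists; a real helper would create it when missing.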
As described in this stackoverflow post, it's not possible to do an assembly binding redirect between assemblies with different public keys. Polly 5.9 doesn't have a public key (i.e. it's null), so at least I couldn't get the binding to the newer 6.0 version to work.
As of writing this, I don't have a solution to this problem. I'm currently awaiting an answer from Sitecore on whether CMP and SPS can live together or not.
We've discovered a rare issue in Sitecore Publishing Service (SPS) where it may publish incorrect content to some fields. Even though I think SPS does this wrong, the root cause was inconsistent data in the master database. It turned out such inconsistency exists in most databases, even in a clean Sitecore install.
This week I've been speaking at the Sitecore developer meeting in Göteborg. My talk was a live demo of a rather different way of using Sitecore.
We at Stendahls have created an event solution where we've thrown away PowerPoint and use a web browser instead for presentations. Each member of the audience has an iPad mini as a second screen, and the speakers control everything from an iPad instead of using a clicker.
Everything is driven by Sitecore and SignalR, and we ran this setup for 120 iPads and about ten presentation screens. It worked extremely well, and just imagine what you can do with DMS, ECM, Analytics, and personalization when every audience member is logged in as a Sitecore user using a PIN code from their badge. 🙂
I hope I’ll be doing the same speech at the next Sitecore developer meeting in Malmö.
There seem to be tons of arguments about whether you should store your Media Library files in the database or as files on disk. I usually prefer storing them in the database, since it's usually easier to manage the data that way. The performance impact is virtually none anyway, since I always use a CDN. But I agree with many of the counterarguments as well, so I'll leave that decision to you.
If you choose database storage, one thing worth considering is moving the Blobs table to a separate file store, so that you can put the binaries on a cheaper disk volume. If you, for example, run your primary database on SSDs, it'd work just fine to keep the blobs on conventional hard disks.
We tend to spend quite some time on URL management in order to make URLs nice and SEO friendly in all sorts of ways, but I've noticed that most CMSs I've been in contact with have overlooked one of the areas I consider most important: proper handling of permanent redirects (301) when URLs change.
The URL structure should of course reflect the site structure, contain relevant keywords, and so on. When content authors work with the site, those URLs may change, and the old ones return a "404 Not Found". That's bad from an SEO perspective, but it also means current visitors have already downloaded HTML pages with now-broken links. My experience is that this escalates when integrations with other systems generate items.
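The idea of keeping old URLs alive can be sketched as a redirect table consulted before returning a 404. This is a hypothetical illustration (the paths and function are invented, not Sitecore code):

```python
# Hypothetical redirect table, updated whenever an item's URL changes
# (old URL -> new URL). Paths are invented for illustration.
redirects = {
    "/products/old-name": "/products/new-name",
    "/products/new-name": "/catalog/new-name",
}

def resolve(path):
    """Return (status, location) for a request path, collapsing chained
    renames A -> B -> C into a single 301 to the final target."""
    seen, target = set(), path
    while target in redirects and target not in seen:
        seen.add(target)   # guard against accidental redirect loops
        target = redirects[target]
    return (200, path) if target == path else (301, target)
```

Collapsing chains into one 301 matters: each extra hop costs the visitor a round trip and dilutes the SEO value being passed along.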
This is a follow-up on yesterday's post regarding auto-installing NuGet packages into Sitecore.
On request, here is my prototype code on how I use WebActivator to automatically insert/update embedded items into a Sitecore installation. Please note that this is very much prototype code, and it’s really ugly too. But I hope you get the idea of how it works.
I've been looking for some time for a more streamlined development process for building Sitecore modules and sharing code between Sitecore projects. Separating code into modules is easy, but when you have dependencies on Sitecore items, things become a bit more complicated.
The package installation wizard in Sitecore is helpful, and it gets better with update packages generated by Hedgehog TDS. But I think we can do better. With multiple developers and multiple installations (dev, test, QA, live environments, etc.), the update process becomes too time consuming and painful.
In addition, I want to be able to create generic modules that can be shared between different Sitecore projects, so I want the code to be built and tested on our build servers and deployed to our local NuGet server.
I've built a prototype of this, and I think it worked out quite nicely, though there is still a lot of room for improvement.
When working on large-scale web applications, typically hosted on multiple content delivery servers, whether due to high load or a requirement for redundancy, there are quite a few pitfalls to handle.
One of them is content generated on demand by the content delivery web servers. My most common case is ordinary web images. Images are typically selected by content authors, who don't know anything about pixels, bandwidth, and so on; the people responsible for that are the web developers. The allowed image size constraints should be specified in the aspx/ascx/cshtml files. This means you use some function to resize the image when it's requested and cache the result, typically as a file on the content delivery web server.
Since you're a good internet citizen, you include version information for your generated images in the image URL and use long client-side cache times. And since you need speed, you use a Content Delivery Network (CDN) as well. Here I assume you use the site as the CDN origin, i.e. the images aren't uploaded to a CDN storage.
Now, with multiple content delivery servers behind a load balancer, this usually causes problems. Let's say you update something that changes the image and its URL (the versioning part). During your CMS's publishing sequence, there will be a state where some content delivery servers are updated and some are not. The window may be short, but it will be there (*).
Since we're talking large scale here, you now have a problem. A visitor to your newly published site gets an updated HTML page with the new image URL. The browser then loads the new image, typically through the CDN, and that request eventually reaches the content delivery cluster. Where will it end up? Possibly (or probably, if your load balancer considers backend latency) on a server that doesn't have the new image yet; we're still in the middle of a publishing process, after all. So, depending on your implementation, you'll serve an old image, generate a new image based on old data, or return a 404. All are bad.
To make things worse, the response will probably be cached by your CDN, so now all visitors get an incorrect image.
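The versioning part described above can be made deterministic, so that every server in the cluster derives the same URL for the same content without shared state. A minimal Python sketch, where the function name and parameters are my own illustration rather than any Sitecore API:

```python
import hashlib

def image_url(path, width, revision):
    """Build a version-stamped image URL. The token is derived purely from
    the path, requested size, and item revision, so every server computes
    the same URL for the same content -- no shared state needed."""
    token = hashlib.sha1(f"{path}|{width}|{revision}".encode()).hexdigest()[:8]
    return f"{path}?w={width}&v={token}"
```

A change to either the revision or the requested width produces a new `v` token, so stale and fresh HTML never point at the same cached derivative in the CDN.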
Serving your Sitecore media items on public sites through a Content Delivery Network (CDN) is always good. I like using Amazon CloudFront, since it's really cheap and easy to set up. For large, high-volume sites, I'd look at alternatives as well. But since CloudFront has no startup cost or fixed monthly fees, I think all small and mid-size Sitecore setups should take advantage of it.
There are quite a few tutorials out there on how to configure Sitecore for CloudFront, so I won't go into that here. Maybe I'll post something later.
Sometimes you end up with pages on your site that are accessible from several different URLs. It might not seem like a big deal, but from an SEO perspective it's bad. Each page should have only one URL; otherwise Google and other search engines may treat your pages as duplicate content, resulting in a lower page rank.
In ASP.NET, URLs are mostly treated as case insensitive, but a URL is actually case sensitive (see w3.org). This means you may accidentally refer to a page using different casing and thereby have duplicate content on your site.
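A common mitigation is to pick one casing as canonical and answer every other variant with a 301. A minimal sketch, assuming lower-case is the canonical form (the function is illustrative, not framework code):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Return (needs_redirect, canonical_url). Scheme, host, and path are
    lower-cased; the query string is left alone, since its values may be
    legitimately case sensitive."""
    p = urlsplit(url)
    canonical = urlunsplit((p.scheme.lower(), p.netloc.lower(),
                            p.path.lower(), p.query, p.fragment))
    return (canonical != url, canonical)
```

When `needs_redirect` is true, respond with a 301 to `canonical_url` so search engines consolidate the variants onto one URL.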
After lots of frustration getting Combres to work properly with Sitecore 6.6, I got very good help from the Sitecore support team and found a solution worth sharing with the community.
The problem is that since the first release of Sitecore 6.6, you cannot register a route to a synchronous IHttpHandler. (In the developer preview of 6.6 and in previous versions you could.) Instead you get this rather confusing error: