Facebook is tweaking the algorithms that decide which stories and posts you see in your News Feed, with a greater emphasis on how long you spend reading one kind of post relative to others.

Previously, the stories that you'd see on your Facebook home page were primarily determined by how many Likes, Shares or comments they received from your friends and other users. But that method didn't necessarily highlight the content that was most "meaningful" to users, according to a blog post on Facebook's company news site.

"There are times when, for example, people want to see information about a serious current event, but don't necessarily want to like or comment on it," wrote software engineers Ansha Yu and Sami Tas.

The News Feed in its current form works by prominently displaying stories, photos, videos and status updates from your friends and product or brand pages you follow, based on what Facebook's algorithm determines you'll be most interested in.

Other activity from friends and pages is relegated to the smaller feed on the right side of the page.
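Facebook has not published how its ranking actually works, but the engagement-driven approach described above can be pictured with a minimal sketch. The `Post` class, `engagement_score` and `rank_feed` below are hypothetical names invented for illustration, not anything from Facebook's systems; the sketch simply assumes a post's visibility rises with the total Likes, Shares and comments it has received.

```python
# Hypothetical illustration only: Facebook's actual ranking system is proprietary.
# This sketch orders posts by a simple engagement count (Likes + Shares + comments),
# the kind of signal the article says the News Feed previously leaned on.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    """Naive score: total Likes, Shares and comments from friends and other users."""
    return post.likes + post.shares + post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order posts so the most-engaged-with content appears first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("serious-news-story", likes=3, shares=1, comments=0),
    Post("vacation-photos", likes=120, shares=10, comments=42),
])
print([p.post_id for p in feed])  # ['vacation-photos', 'serious-news-story']
```

As the example suggests, a serious news story that people read but don't Like or comment on would sink below lighter, more reaction-friendly posts, which is the problem the blog post describes.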

Screenshot of the Facebook News Feed, June 2015: The number of Facebook friends sharing a particular piece of content increases the likelihood that you'll see it in your own News Feed, while activity deemed less relevant to you is relegated to the smaller feed on the right side of the page. (CBC News/Screenshot)

Simply measuring how long a post sits on your screen isn't useful, however, without comparing it to how long you spend on other posts.

"Some people may spend ten seconds on a story because they really enjoy it, while others may spend ten seconds on a story because they have a slow internet connection," said Facebook.

The updated algorithms will now compare how long you spend looking at a post with how long you've lingered on other posts in your Feed.

For example, it'll give priority to the types of posts that stay on your screen longer than those you've scrolled past in seconds.
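One way to picture that relative comparison, purely as a sketch: weight each post's dwell time against the same person's typical dwell time, so a slow connection (which inflates every dwell time equally) doesn't skew the signal. The function name `relative_dwell_boost` and the use of a median baseline are assumptions for illustration, not Facebook's published method.

```python
# Hypothetical illustration only; Facebook has not disclosed its formula.
# Idea: a post's dwell time matters only relative to how long the same person
# typically spends on other posts, so slow connections don't distort the signal.
from statistics import median

def relative_dwell_boost(dwell_seconds: float, user_dwell_history: list[float]) -> float:
    """Return a multiplier >1 for posts viewed longer than this user's typical post,
    and <1 for posts scrolled past quickly."""
    baseline = median(user_dwell_history) or 1.0  # guard against a zero baseline
    return dwell_seconds / baseline

# Someone who lingers 10 seconds on most posts gets no boost from a 10-second view...
print(relative_dwell_boost(10.0, [9.0, 10.0, 11.0]))  # ~1.0
# ...but a reader who usually scrolls past in 2 seconds and stops for 10 does.
print(relative_dwell_boost(10.0, [1.5, 2.0, 2.5]))    # 5.0
```

Under this kind of per-user comparison, the ten seconds spent by someone on a slow connection and the ten seconds spent by an engaged reader are no longer treated as the same signal, which is the distinction the engineers' quote draws.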

Facebook has been experimenting with changes to the News Feed algorithms over the past several months, including showing posts with more diverse political or ideological perspectives, and filtering out "hoax" stories.

In July 2014, Facebook chief operating officer Sheryl Sandberg apologized after it was found that the site had experimented with how many positive and negative stories it showed in the News Feeds of about 700,000 users, to gauge whether the tone of those posts provoked similar emotional responses.