Many of us have heard that posting consistently is one of the most important things you can do to grow your audience on social media. In this analysis we'll test whether that's true and quantify the effect that posting frequency actually has.
We’ll start the analysis by looking at follower growth. What effect does posting frequency have on follower growth?
Conclusions
This analysis of 4.8 million channel-week observations across Facebook, Instagram, and X provides clear evidence that posting frequency directly impacts follower growth. The Z-score analysis and fixed effects regression model tell the same story: posting more leads to better performance.
The data reveals several key insights to keep in mind. First, there’s a penalty for not posting. Channels that don’t post in a given week consistently underperform their baseline growth rates, which suggests that people and algorithms reward active accounts and penalize inactive ones.
Second, any posting is far better than no posting. Even channels that post just 1-2 times per week see meaningful improvements over weeks when they don't post at all. To me this suggests that consistency matters more than anything: there's a real cost to not posting anything.
Third, the benefits of posting more appear to grow at higher posting frequencies rather than diminish. Channels posting 10+ times per week see the largest follower growth, adding an average of 32 additional followers compared to weeks with no posts.
The fixed effects model is particularly compelling because it compares each channel against itself over time, controlling for all the factors that make channels different from each other. When the same channel posts more frequently, it consistently grows faster. This gives us confidence that posting frequency is actually driving growth, not just correlated with it.
For creators, this analysis supports a strategy of frequent and consistent posting. While the optimal frequency varies by platform and audience, the data clearly shows that more active channels outperform less active ones. The question isn’t whether to post frequently - it’s how frequently you can post while maintaining quality and sustainability.
Posting Frequency and Follower Growth
Buffer collects followers data for Instagram, Facebook, and X profiles that have the analytics entitlement, meaning they belong to users that have a paid plan or started a trial. We also collect data for LinkedIn pages, but not personal profiles. For that reason, we’ll focus on Facebook, Instagram, and X audience growth.
The SQL query below returns approximately 4.8 million records that include the number of posts each channel creates per week as well as their weekly follower growth. Approximately 161k profiles are included in this dataset.
Code
sql <- "
with profile_activity_windows as (
    -- Find the first and last week of posting activity for each Facebook profile
    select
        p.service_id as channel_id,
        'facebook' as service,
        min(timestamp_trunc(up.sent_at, week)) as first_post_week,
        max(timestamp_trunc(up.sent_at, week)) as last_post_week
    from dbt_buffer.publish_updates as up
    inner join dbt_buffer.publish_profiles as p
        on up.profile_id = p.id
    where up.profile_service = 'facebook'
        and up.sent_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2

    union all

    -- Find the first and last week of posting activity for each Twitter profile
    select
        p.service_id as channel_id,
        'twitter' as service,
        min(timestamp_trunc(up.sent_at, week)) as first_post_week,
        max(timestamp_trunc(up.sent_at, week)) as last_post_week
    from dbt_buffer.publish_updates as up
    inner join dbt_buffer.publish_profiles as p
        on up.profile_id = p.id
    where up.profile_service = 'twitter'
        and up.sent_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2
),

weekly_facebook_posts as (
    select
        p.service_id as channel_id,
        'facebook' as service,
        timestamp_trunc(up.sent_at, week) as week,
        count(distinct up.id) as posts
    from dbt_buffer.publish_updates as up
    inner join dbt_buffer.publish_profiles as p
        on up.profile_id = p.id
    where up.profile_service = 'facebook'
        and up.sent_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_instagram_posts as (
    select
        service_id as channel_id,
        'instagram' as service,
        timestamp_trunc(created_at, week) as week,
        count(distinct service_update_id) as posts
    from dbt_buffer.analyze_instagram_analytics_user_media_totals
    where created_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_twitter_posts as (
    select
        p.service_id as channel_id,
        'twitter' as service,
        timestamp_trunc(up.sent_at, week) as week,
        count(distinct up.id) as posts
    from dbt_buffer.publish_updates as up
    inner join dbt_buffer.publish_profiles as p
        on up.profile_id = p.id
    where up.profile_service = 'twitter'
        and up.sent_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_facebook_followers_base as (
    select
        service_id as channel_id,
        'facebook' as service,
        timestamp_trunc(checked_at, week) as week,
        max(fans) as week_end_followers
    from dbt_buffer.analyze_facebook_analytics_page_daily_totals
    where checked_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_facebook_followers as (
    select
        channel_id,
        service,
        week,
        week_end_followers,
        lag(week_end_followers) over (partition by channel_id order by week) as prev_week_followers
    from weekly_facebook_followers_base
),

weekly_instagram_followers_base as (
    select
        service_id as channel_id,
        'instagram' as service,
        timestamp_trunc(checked_at, week) as week,
        max(followers_count) as week_end_followers
    from dbt_buffer.analyze_instagram_analytics_user_daily_totals
    where checked_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_instagram_followers as (
    select
        channel_id,
        service,
        week,
        week_end_followers,
        lag(week_end_followers) over (partition by channel_id order by week) as prev_week_followers
    from weekly_instagram_followers_base
),

weekly_twitter_followers_base as (
    select
        service_id as channel_id,
        'twitter' as service,
        timestamp_trunc(checked_at, week) as week,
        max(followers_count) as week_end_followers
    from dbt_buffer.analyze_twitter_analytics_user_daily_totals
    where checked_at >= timestamp_sub(current_timestamp, interval 365 day)
    group by 1, 2, 3
),

weekly_twitter_followers as (
    select
        channel_id,
        service,
        week,
        week_end_followers,
        lag(week_end_followers) over (partition by channel_id order by week) as prev_week_followers
    from weekly_twitter_followers_base
),

combined_data as (
    select
        f.channel_id,
        f.service,
        f.week,
        coalesce(p.posts, 0) as posts,
        f.week_end_followers,
        f.prev_week_followers,
        case
            when f.prev_week_followers > 0
            then ((f.week_end_followers - f.prev_week_followers) * 100.0 / f.prev_week_followers)
            else null
        end as follower_growth_pct,
        f.week_end_followers - f.prev_week_followers as follower_growth_absolute
    from weekly_facebook_followers f
    left join weekly_facebook_posts p
        on f.channel_id = p.channel_id
        and f.week = p.week
    -- Only include weeks within the active posting window
    inner join profile_activity_windows paw
        on f.channel_id = paw.channel_id
        and f.service = paw.service
        and f.week >= paw.first_post_week
        and f.week <= paw.last_post_week

    union all

    select
        f.channel_id,
        f.service,
        f.week,
        coalesce(p.posts, 0) as posts,
        f.week_end_followers,
        f.prev_week_followers,
        case
            when f.prev_week_followers > 0
            then ((f.week_end_followers - f.prev_week_followers) * 100.0 / f.prev_week_followers)
            else null
        end as follower_growth_pct,
        f.week_end_followers - f.prev_week_followers as follower_growth_absolute
    from weekly_instagram_followers f
    left join weekly_instagram_posts p
        on f.channel_id = p.channel_id
        and f.week = p.week
    -- Note: Instagram uses a different data source, so no activity window filtering here
    -- unless you want to create a similar window based on the instagram posts data

    union all

    select
        f.channel_id,
        f.service,
        f.week,
        coalesce(p.posts, 0) as posts,
        f.week_end_followers,
        f.prev_week_followers,
        case
            when f.prev_week_followers > 0
            then ((f.week_end_followers - f.prev_week_followers) * 100.0 / f.prev_week_followers)
            else null
        end as follower_growth_pct,
        f.week_end_followers - f.prev_week_followers as follower_growth_absolute
    from weekly_twitter_followers f
    left join weekly_twitter_posts p
        on f.channel_id = p.channel_id
        and f.week = p.week
    -- Only include weeks within the active posting window
    inner join profile_activity_windows paw
        on f.channel_id = paw.channel_id
        and f.service = paw.service
        and f.week >= paw.first_post_week
        and f.week <= paw.last_post_week
)

select
    service,
    channel_id,
    week,
    posts,
    week_end_followers,
    prev_week_followers,
    follower_growth_pct,
    follower_growth_absolute,
    case
        when posts = 0 then 'No Posts'
        when posts between 1 and 2 then '1-2 Posts'
        when posts between 3 and 5 then '3-5 Posts'
        when posts between 6 and 10 then '6-10 Posts'
        when posts > 10 then '10+ Posts'
    end as posting_frequency_bin
from combined_data
where prev_week_followers is not null
order by service, channel_id, week
"

# get data from BigQuery
posts <- bq_query(sql = sql)
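The growth metric in the query is the simple week-over-week percentage change: (week_end_followers − prev_week_followers) / prev_week_followers × 100. As a quick sanity check, here's the same calculation sketched in R on a made-up follower series (the numbers are purely illustrative):

```r
# hypothetical weekly follower counts for one channel
followers <- c(100, 110, 105, 126)

# previous week's count, analogous to the SQL lag() window function
prev <- c(NA, head(followers, -1))

# week-over-week growth, matching the query's follower_growth_pct
growth_pct <- (followers - prev) / prev * 100
round(growth_pct, 1)
# NA 10.0 -4.5 20.0
```

The first week has no previous value, which is why the query filters out rows where prev_week_followers is null.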
We can use the skim function from the skimr package to get a summary of this dataset.
posting_frequency_bin top counts — No Posts: 2539606, 1-2 Posts: 1061589, 3-5 Posts: 621423, 6-10 Posts: 279856
Variable type: numeric

skim_variable               n_missing  complete_rate      mean         sd        p0  p25   p50      p75      p100   hist
posts                               0              1      3.68      25.20         0    0     0     2.00      4695  ▇▁▁▁▁
week_end_followers                  0              1  21708.63  337551.29     -1938  243  1044  4100.00  56666742  ▇▁▁▁▁
prev_week_followers                 0              1  21681.23  337541.27         1  242  1040  4086.00  56699599  ▇▁▁▁▁
follower_growth_pct                 0              1      4.39    1263.37    -14500    0     0     0.23   1496600  ▇▁▁▁▁
follower_growth_absolute            0              1     27.40     999.91  -1343728    0     0     3.00    418141  ▁▁▁▇▁

Variable type: POSIXct

skim_variable  n_missing  complete_rate         min         max      median  n_unique
week                   0              1  2024-06-30  2025-06-22  2024-11-24        52
Next we’ll calculate the average follower growth rate by the number of posts sent in a given week and plot them for each network.
The plot shows a strong positive correlation for each network. It also shows a potential penalty for not posting, particularly for X profiles. This is the sort of correlation we might expect, but we still want to control for differences in profiles' natural growth rates.
Code
# summary stats
posts %>%
  # order the bins from least to most active (the default ordering is alphabetical)
  mutate(posting_frequency_bin = factor(
    posting_frequency_bin,
    levels = c("No Posts", "1-2 Posts", "3-5 Posts", "6-10 Posts", "10+ Posts")
  )) %>%
  group_by(service, posting_frequency_bin) %>%
  summarise(avg_growth = mean(follower_growth_pct, na.rm = TRUE)) %>%
  ggplot(aes(x = posting_frequency_bin, y = avg_growth, fill = service)) +
  geom_col(show.legend = FALSE) +
  facet_wrap(~service, scales = "free_y") +
  labs(
    x = "Weekly Number of Posts",
    y = NULL,
    title = "Average Follower Growth Rate by Posting Frequency",
    subtitle = "Profiles that post more frequently tend to gain more followers on average"
  )
To control for these differences, we'll first utilize Z-scores. Here is the approach to calculating a Z-score, which we'll do for each channel:
Calculate the mean and standard deviation of follower growth rates across all weeks
Calculate a Z-score for each week. Z = (follower growth - average growth) ÷ standard deviation.
For each channel and week, the Z-score tells us how many standard deviations above or below average a channel performed in a given week:
Z = 0: Typical performance for this channel
Z = +1: Strong positive week (better than ~84% of weeks)
Z = +2: Exceptional week (better than ~97% of weeks)
Z = -1: Poor week (worse than ~84% of weeks)
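The percentages above come straight from the standard normal distribution (assuming a channel's weekly growth rates are roughly normal around their mean). In R, pnorm() converts a Z-score into the share of weeks it outperforms:

```r
# share of a standard normal distribution falling below each Z-score
round(pnorm(c(0, 1, 2, -1)), 3)
# 0.500 0.841 0.977 0.159
```

So a week with Z = +1 beats about 84% of that channel's weeks, and Z = +2 beats about 98%.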
Below we calculate z-scores for each profile each week and calculate some summary statistics.
Code
# calculate channel-specific mean and standard deviation
channel_stats <- posts %>%
  group_by(channel_id) %>%
  summarise(
    mean_growth = mean(follower_growth_pct, na.rm = TRUE),
    sd_growth = sd(follower_growth_pct, na.rm = TRUE),
    n_weeks = n()
  ) %>%
  # only keep channels with sufficient data and variation
  filter(n_weeks >= 3, sd_growth > 0)

# merge back and calculate Z-scores
posts_with_z <- posts %>%
  inner_join(channel_stats, by = "channel_id") %>%
  mutate(z_score = (follower_growth_pct - mean_growth) / sd_growth)

# calculate summary statistics
posts_with_z %>%
  group_by(posting_frequency_bin) %>%
  summarise(
    n_observations = n(),
    mean_z_score = mean(z_score, na.rm = TRUE),
    median_z_score = median(z_score, na.rm = TRUE)
  )
This data suggests that there is a positive relationship between posting frequency and follower growth. Channels that post more frequently consistently gain more followers relative to their baseline growth.
It also suggests that there is a cost to not posting. Channels that don't post at all underperform their baseline growth rates; we can call this the "no post penalty". Even posting once or twice a week results in a significant increase in follower growth compared to weeks in which channels post nothing.
The difference between moderate posting (3-5 posts) and more frequent posting (6-10 posts) is relatively small, and there could even be diminishing returns for channels that post more than 10 times per week. This suggests to me that finding a sustainable posting cadence matters more than just posting as much as possible.
Below I’ve plotted average Z-scores for each platform. For each one there is a penalty for not posting, and a strong positive correlation between the number of posts sent in a given week and the number of followers gained.
Fixed Effects Regression Model
A fixed effects regression model compares the same channel against itself over time, rather than comparing different channels to each other.
Instead of asking “Do channels that post more grow faster?” (which could be biased because high-posting channels might just be better at social media), fixed effects asks “When the same channel posts more vs. less, does it grow faster?” This within-channel comparison controls for all the unchanging characteristics that make channels different from each other.
The model essentially creates a separate baseline for each channel, then measures how posting frequency affects growth relative to that channel’s own average performance.
The key advantage is that we can make stronger causal claims about posting frequency. When we see that the same channel grows faster during weeks when it posts more, we can be more confident that posting is actually driving the growth rather than just being correlated with it.
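To see why the within-channel comparison matters, here is a minimal simulated sketch (the channel labels, effect sizes, and noise are all made up). Two channels share the same true posting effect, but the channel with the higher baseline growth also happens to post more, so pooled OLS overstates the effect, while demeaning each channel (which is exactly what a channel fixed effects estimator does) recovers it:

```r
set.seed(42)

# two hypothetical channels: "b" has much higher baseline growth and also posts more
n <- 200
channel <- rep(c("a", "b"), each = n)
posts_wk <- c(rpois(n, 2), rpois(n, 6))
baseline <- rep(c(0, 50), each = n)

# true within-channel effect of one extra post: +2
growth <- baseline + 2 * posts_wk + rnorm(2 * n)

# pooled OLS confounds posting with baseline differences between channels
pooled <- unname(coef(lm(growth ~ posts_wk))["posts_wk"])

# demean within each channel, then regress: the fixed effects estimate
posts_dm <- posts_wk - ave(posts_wk, channel)
growth_dm <- growth - ave(growth, channel)
within <- unname(coef(lm(growth_dm ~ posts_dm))["posts_dm"])

round(c(pooled = pooled, within = within), 2)
# pooled lands well above 2; within sits close to the true effect of 2
```

The feols() call below does this demeaning at scale across all channels, with standard errors clustered by channel.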
Code
# fit fixed effects model
fe_model <- feols(
  follower_growth_pct ~ posting_frequency_bin | channel_id,
  data = posts,
  cluster = "channel_id"
)

# summarise model
summary(fe_model)
The fixed effects model summary suggests that a clear, positive relationship exists between posting frequency and follower growth, with each level of posting activity driving higher follower growth.
The model shows that posting 1-2 times per week gains channels an average of around 8 additional followers compared to weeks in which they don’t post at all.
The benefits increase substantially with higher posting frequencies as well. Posting 3-5 times per week adds around 10 followers on average, while posting 6-10 times adds around 21. The most active channels posting 10+ times per week see the largest gains at 32 additional followers per week.
The progression shows larger follower gains up to the highest posting frequency. Each jump in posting activity seems to produce progressively larger benefits: for example, going from 1-2 posts to 3-5 posts adds about 2.6 followers, while going from 6-10 posts to 10+ posts adds nearly 11 additional followers.
These results provide pretty clear benchmarks for a social media strategy, at least for these platforms. The data strongly supports the idea that consistent, frequent posting is a major driver of follower growth.
Engagement Rates
Next we’ll turn our attention to engagement rates. The SQL query below returns approximately 15.7 million posts and their engagement rates. It also includes baseline statistics for profiles like the average engagement rate, average number of posts sent per week, and the standard deviation of the number of posts sent per week.
Code
sql <- "
with profile_tenure as (
    -- calculate when each profile started posting and their tenure
    select
        profile_id,
        user_id,
        profile_service,
        min(date(sent_at)) as first_post_date,
        max(date(sent_at)) as last_post_date,
        date_diff(current_date(), min(date(sent_at)), day) as tenure_days,
        count(distinct date(sent_at)) as active_days,
        count(distinct id) as total_posts
    from dbt_buffer.publish_updates
    where sent_at >= timestamp_sub(current_timestamp, interval 730 day)
        and profile_service not in ('pinterest', 'tiktok', 'youtube')
    group by 1, 2, 3
),

qualified_profiles as (
    select distinct profile_id
    from profile_tenure
    where tenure_days >= 84 -- started posting at least 12 weeks ago
        and total_posts >= 20 -- minimum posts for statistical reliability
),

profile_baseline as (
    select
        profile_id,
        profile_service,
        user_id,
        approx_quantiles(case
            when profile_service = 'instagram'
            then (coalesce(likes, 0) + coalesce(comments, 0) + coalesce(shares, 0)) / nullif(reach, 0) * 100
            else engagement_rate
        end, 2)[offset(1)] as baseline_engagement,
        count(*) as baseline_posts
    from dbt_buffer.publish_updates
    where sent_at >= timestamp_sub(current_timestamp, interval 365 day)
        and engagement_rate > 0
        and profile_service not in ('pinterest', 'tiktok', 'youtube')
    group by 1, 2, 3
    having count(*) >= 10
),

weekly_activity as (
    select
        up.user_id,
        up.profile_id,
        up.profile_service,
        date_trunc(date(up.sent_at), week(monday)) as week_start,
        count(distinct up.id) as posts_in_week,
        approx_quantiles(case
            when up.profile_service = 'instagram'
            then (coalesce(up.likes, 0) + coalesce(up.comments, 0) + coalesce(up.shares, 0)) / nullif(up.reach, 0) * 100
            else up.engagement_rate
        end, 2)[offset(1)] as median_weekly_engagement
    from dbt_buffer.publish_updates as up
    inner join qualified_profiles as qp
        on up.profile_id = qp.profile_id
    where up.sent_at >= timestamp_sub(current_timestamp, interval 365 day)
        and up.engagement_rate > 0
        and up.profile_service not in ('pinterest', 'tiktok', 'youtube')
    group by 1, 2, 3, 4
),

-- calculate consistent time windows
time_windows as (
    select
        profile_id,
        greatest(first_post_date, date_sub(current_date(), interval 365 day)) as analysis_start_date,
        current_date() as analysis_end_date,
        date_diff(current_date(), greatest(first_post_date, date_sub(current_date(), interval 365 day)), week) as total_possible_weeks
    from profile_tenure
)

select
    wa.user_id,
    wa.profile_id,
    wa.profile_service,
    pt.tenure_days,
    pt.first_post_date,
    pb.baseline_engagement,
    -- posting frequency metrics
    count(distinct wa.week_start) as active_weeks,
    sum(wa.posts_in_week) as total_posts,
    avg(wa.posts_in_week) as avg_posts_per_week,
    stddev(wa.posts_in_week) as posts_per_week_sd,
    -- posting consistency metrics (using consistent time window)
    tw.total_possible_weeks,
    case
        when tw.total_possible_weeks > 0
        then count(distinct wa.week_start) / tw.total_possible_weeks
        else 0
    end as pct_weeks_with_posts,
    -- engagement metrics (using median)
    approx_quantiles(wa.median_weekly_engagement, 2)[offset(1)] as median_engagement,
    approx_quantiles(wa.median_weekly_engagement, 2)[offset(1)] - pb.baseline_engagement as engagement_lift
from weekly_activity as wa
inner join profile_tenure as pt
    on wa.profile_id = pt.profile_id
inner join profile_baseline as pb
    on wa.profile_id = pb.profile_id
inner join time_windows as tw
    on wa.profile_id = tw.profile_id
where wa.week_start >= date_trunc(tw.analysis_start_date, week(monday))
group by 1, 2, 3, 4, 5, 6, tw.total_possible_weeks
having count(distinct wa.week_start) >= 4 -- at least 4 weeks of activity
"

# get data from BigQuery
posts <- bq_query(sql = sql)
Data Tidying
First we'll remove outliers in the data and filter out profiles with fewer than 4 active weeks or fewer than 20 total posts. We'll also only look at profiles that have been connected to Buffer for at least 6 months.
Code
# function to calculate z-scores and remove outliers
remove_outliers <- function(data, columns, threshold = 3) {
  for (col in columns) {
    if (col %in% names(data)) {
      z_scores <- abs(scale(data[[col]])[, 1])
      data <- data[z_scores < threshold, ]
    }
  }
  return(data)
}

# clean data and handle outliers
posts_clean <- posts %>%
  filter(
    total_posts >= 20,
    active_weeks >= 4,
    tenure_days >= 180, # 6 months
    !is.na(median_engagement)
  ) %>%
  # remove extreme outliers using z-scores
  remove_outliers(columns = c("avg_posts_per_week", "median_engagement"), threshold = 3)
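To see what the outlier filter does, here's a quick check on fabricated data (the helper is repeated so the snippet runs on its own): any row whose value sits more than 3 standard deviations from the column mean is dropped.

```r
# same helper as above, redefined so this snippet is self-contained
remove_outliers <- function(data, columns, threshold = 3) {
  for (col in columns) {
    if (col %in% names(data)) {
      z_scores <- abs(scale(data[[col]])[, 1])
      data <- data[z_scores < threshold, ]
    }
  }
  return(data)
}

set.seed(1)
# 100 ordinary values plus one extreme outlier in x
d <- data.frame(x = c(rnorm(100), 1000), y = rnorm(101))
nrow(remove_outliers(d, "x"))
# 100 -- only the extreme row is removed
```

One outlier this large inflates the column's standard deviation, but its own Z-score is still far beyond 3, so the filter catches it while leaving the ordinary rows intact.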
For each platform and profile, we’ll plot the median engagement rate by the percentage of weeks in which posts were shared. The theory we would like to explore is that profiles that post in a higher percentage of weeks have higher median engagement rates.
Code
# define consistency tiers
posts_clean <- posts_clean %>%
  mutate(consistency_tier = case_when(
    pct_weeks_with_posts <= 0.33 ~ "Low",
    pct_weeks_with_posts <= 0.67 ~ "Medium",
    pct_weeks_with_posts > 0.67 ~ "High",
    TRUE ~ "Other"
  ))

# order factor levels
posts_clean$consistency_tier <- factor(posts_clean$consistency_tier,
                                       levels = c("Low", "Medium", "High"))

# plot weeks with posts vs median engagement
posts_clean %>%
  mutate(pct_weeks_with_posts = ifelse(pct_weeks_with_posts > 1, 1, pct_weeks_with_posts)) %>%
  group_by(
    profile_service,
    consistency_tier = cut(pct_weeks_with_posts, seq(0, 1, 0.2))
  ) %>%
  summarise(med_engagement = median(median_engagement, na.rm = TRUE)) %>%
  ggplot(aes(x = consistency_tier, y = med_engagement, fill = profile_service)) +
  geom_col(show.legend = FALSE) +
  facet_wrap(~profile_service) +
  theme(legend.position = "none") +
  labs(
    x = "Percent of Weeks With Sent Posts",
    y = NULL,
    title = "Median Engagement Rate by Posting Consistency Rate"
  )
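One detail worth noting about the cut() call used in the plot: cut(pct_weeks_with_posts, seq(0, 1, 0.2)) produces five bins that are open on the left and closed on the right, so a consistency rate of exactly 0 would fall outside every bin:

```r
# how cut() assigns consistency rates to 20%-wide bins
pct <- c(0.05, 0.25, 0.45, 0.65, 0.85, 1)
as.character(cut(pct, seq(0, 1, 0.2)))
# "(0,0.2]" "(0.2,0.4]" "(0.4,0.6]" "(0.6,0.8]" "(0.8,1]" "(0.8,1]"

# a rate of exactly 0 becomes NA unless include.lowest = TRUE is set
as.character(cut(0, seq(0, 1, 0.2)))
# NA
```

That edge case doesn't bite here, because every profile in the dataset has at least four active weeks and therefore a positive consistency rate.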
The results are mixed. For several of the platforms there does seem to be a positive correlation between the proportion of weeks in which posts were sent and median engagement rate, but only up to a certain point, around 50% of weeks. Once a profile posts in at least 50% of eligible weeks, median engagement rates even seem to decline.