As of 1st June 2026, GitHub Copilot will charge its customers based on the tokens they use, rather than on a flat-rate subscription model.
The model being retired is, or rather was, simple to understand and use. Users were given a set number of 'Premium Requests' according to their subscription tier. A complex coding job that might have taken many hours to complete used one premium request. Posing a relatively trivial question also counted as a single premium request.
However, the change that will soon affect GitHub Copilot users aligns its pricing with the per-token API costs of large language models, more common among enterprise plans. Under the new GitHub Copilot pricing scheme, most requests will be measured according to the tokens used: those input to, and output from, the LLM at the heart of Copilot.
The definition and cost of tokens
A token is often described as representing around three-quarters of a word. Thus, giving an LLM a text of 10,000 words to examine would equate to 12,000-13,000 tokens of content. In developer terms, if a body of code that Copilot were asked to examine (for refactoring or bug-hunting, for example) comprised 10,000 'words' (expressions, statements, variable names, functions, and so on), then using it in a single query, once, would count as 12,000-13,000 tokens out of the monthly allotment.
Prompt text, as input, will also count, as will the output from Copilot.
The pricing tiers coming into effect next month remain pegged at their current levels, but instead of being allotted a number of queries per month, users are given 'AI Credits' to the same value. A base-tier Copilot Pro subscriber ($10 per month) will receive 1,000 credits, with GitHub saying that at present one AI Credit is worth one US cent.
The number of tokens each credit buys will depend on the model used, the input/output mix, the size of the cache (data held in the LLM's memory for context), and the feature requested. Thus, a developer who mostly makes simple queries is unlikely to need to buy extra tokens, in the form of credits, each month. Conversely, multi-agent queries about a complex, lengthy codebase will empty the AI Credit account more quickly. Queries to the most advanced frontier models will cost more than those to less powerful ones.
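The arithmetic above can be sketched in a few lines of Python. This is a back-of-envelope estimate only: the word-to-token ratio and the one-cent credit value come from the figures in this article, but the per-thousand-token rate is a hypothetical placeholder, since actual rates will vary by model and feature.

```python
# Rough estimate of AI Credit usage for a Copilot query.
# Assumptions (only the first two come from this article):
#   - a token represents roughly three-quarters of a word
#   - one AI Credit is worth one US cent
#   - the blended per-token rate below is purely illustrative

WORDS_PER_TOKEN = 0.75                    # ~3/4 of a word per token
CENTS_PER_CREDIT = 1                      # GitHub: 1 AI Credit = 1 US cent
ILLUSTRATIVE_CENTS_PER_1K_TOKENS = 0.3    # hypothetical blended rate


def words_to_tokens(words: int) -> int:
    """Convert a rough word count to an estimated token count."""
    return round(words / WORDS_PER_TOKEN)


def credits_for_tokens(tokens: int) -> float:
    """Estimate AI Credits consumed, under the hypothetical rate above."""
    cost_cents = tokens / 1000 * ILLUSTRATIVE_CENTS_PER_1K_TOKENS
    return cost_cents / CENTS_PER_CREDIT


# The article's example: a 10,000-'word' body of code in one query.
tokens = words_to_tokens(10_000)
print(tokens)                                 # ~13,333 tokens
print(round(credits_for_tokens(tokens), 1))   # ~4.0 credits at the assumed rate
```

At the assumed rate, that single large query would consume only a handful of a Pro subscriber's 1,000 monthly credits; repeated agentic runs over the same codebase, or pricier frontier models, would change that picture quickly.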
GitHub's pricing changes do include some compensatory benefits for users: code completions (similar to a phone's auto-complete function) and Next Edit suggestions will remain free.
The industry's shift to per-token pricing
The changes to GitHub's pricing model are in line with similar moves from other companies. Anthropic and OpenAI have now moved their enterprise customers to token-based billing. Unlike those two, however, Microsoft, the owner of GitHub, is a profitable business overall, and has so far been able to subsidise use of GitHub Copilot with revenues from other parts of the business, such as its software and cloud divisions.
Until the change on 1st June, users may have been able to 'spend' between three and eight times the number of tokens their monthly subscription fees covered, and incurred no penalty.
Microsoft's move affects exactly those it hoped to attract to Copilot's features, immediately forcing new and existing users to become aware of their token spend per query, a figure that has until now been abstracted away by per-month subscriptions. The new billing model may make more economic sense from Microsoft's point of view, but it discourages the exploration and experimentation that new users will want to do.
For businesses that deploy AI coding agents in their development teams, the cost implications of the industry-wide shift in pricing policies are significant. In the case of Uber, for instance, per The Information [paywall], its CTO has said the company has already spent its 2026 AI budget this year, noting that 11% of updates to Uber's code are now written by AI. Uber primarily uses Anthropic's Claude coding agents.
Outside the IT department, companies deploying AI automation should be aware that complex tasks, which may involve running agentic LLMs unsupervised for long periods, may soon be charged on a similar per-token basis. Thus, the delivered efficiency gains from AI in the workforce need to be measured against any rise in AI vendors' bills.
(Image source: Pixabay, under licence.)

