
Artificial Intelligence (AI) is no longer the exclusive domain of technology laboratories or Silicon Valley boardrooms. It has permeated the corridors of governance, taxation and judicial administration. From drafting notices and scrutinising returns to assisting in audit planning and investigations, AI tools have become increasingly integrated into the functioning of tax authorities and quasi-judicial bodies across India and the world. The promise is compelling: AI can process vast volumes of records and case law in seconds, identify precedents, summarise judgments and even suggest probable outcomes for the decision-making process. For overworked adjudicating officers managing hundreds of show cause notices under the Goods and Services Tax (GST) framework, AI appears to offer welcome relief. Yet this technological promise comes with a serious caveat, one that the Honourable Gujarat High Court has now placed firmly on record. When AI tools are used without verification, without independent application of mind and without accountability, they do not merely fail to assist; they dangerously undermine the integrity of the adjudicatory process itself.

(A) The Gujarat High Court case: What Happened?

2. The controversy arose in a dispute pertaining to M/S MARHABBA OVERSEAS PRIVATE LIMITED versus UNION OF INDIA & ORS., Case Reference R/Spl C.A. No. 2229/2026, before the Honourable Gujarat High Court. A quasi-judicial officer passed an adjudication order in a dispute under the Goods and Services Tax Act, 2017. The order, on its face, appeared well-reasoned: it cited legal precedents, invoked principles, and reached a firm conclusion. However, a fundamental flaw lay concealed beneath the surface.

3. During the writ proceedings initiated by the aggrieved taxpayer-petitioner, it came to light that the adjudicating officer had relied upon case law citations generated by an Artificial Intelligence tool. Upon examination, these citations were found to be incorrect, non-existent, or entirely irrelevant to the issues in dispute. The phenomenon, widely known in the AI community as ‘AI hallucination’, had made its way into a formal quasi-judicial order.

“AI hallucination” refers to the tendency of generative AI systems to fabricate plausible-sounding but factually false information, including inventing court citations, judgment names, and legal principles that do not exist in any court record.

The petitioner challenged the order before the High Court, arguing that an adjudication order founded on fabricated and phantom judicial precedents was legally unsustainable. It was further submitted that the adjudicating authority had abdicated its quasi-judicial responsibility by uncritically accepting AI-generated output without independent verification, thereby causing serious prejudice to the petitioner through a fundamental lack of authentic legal reasoning. Notably, the respondents could not defend the accuracy of the citations relied upon, effectively conceding that the adjudication process had been compromised by unchecked reliance on AI tools.

4. The Gujarat High Court issued a strong and unambiguous caution to quasi-judicial authorities while dealing with the case cited supra. The Court’s key observations can be summarised as follows:

  • Independent Verification is Non-Negotiable: Quasi-judicial authorities must independently verify and apply only authentic and relevant judgments while deciding disputes. Reliance on any tool, AI or otherwise, does not discharge this obligation.
  • AI as Assistance, Not Authority: The Court affirmed that AI may assist in legal research, but the ultimate responsibility for legal accuracy, reasoning, and application rests solely with the officer passing the order.
  • Blind Dependence is Impermissible: The Court firmly held that blind dependence on AI tools is wholly impermissible in adjudicatory proceedings. An officer who delegates their judicial reasoning to a machine, without verification, is not performing their lawful duty.
  • Orders Based on Non-Existent Citations are Vitiated: An order that derives its reasoning from fabricated or irrelevant citations suffers from a fundamental legal infirmity and cannot stand the test of judicial scrutiny.

5. Consequently, the Gujarat High Court granted interim relief to the petitioner, staying the operation of the impugned adjudication order until the final disposal of the writ petition. This effectively renders the AI-assisted order unenforceable for the time being, a significant and instructive consequence.

(B) How AI Can Be Used for Effective Tax Administration

6. To fully appreciate the Honourable Gujarat High Court’s ruling, it is important to understand the landscape of AI adoption in tax administration today. AI tools are being employed at multiple levels in the GST ecosystem and beyond:

6.1 Legal Research and Case Law Mining

Officers and practitioners alike routinely use AI-powered legal research platforms to find relevant precedents. Tools powered by Large Language Models (LLMs) can be prompted to find judgments on specific GST or tax-related questions, summarise court orders, or compare divergent High Court rulings. The efficiency gain is substantial: tasks that once took hours can now be completed in minutes.

6.2 Drafting of Notices and Orders

Generative AI is increasingly used to draft show cause notices, adjudication orders, and departmental representations. Officers often input facts and direct the AI to produce a structured order, which is then reviewed and approved. The danger arises when the ‘review’ step is perfunctory or absent.

6.3 Scrutiny and Risk Profiling

At a systemic level, authorities use AI and machine learning models for return scrutiny, anomaly detection, ITC mismatch identification, taxpayer risk profiling and other analytical applications. These applications, being data-driven rather than text-generative, carry a different risk profile from the hallucination problem, though they are not completely shielded from it.
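As a simplified illustration of this data-driven side of the work, the sketch below flags cases where the input tax credit claimed in a return exceeds the credit reflected from suppliers’ filings beyond a tolerance. All figures, GSTINs, field names and the threshold are hypothetical; real departmental analytics are far more sophisticated.

```python
# Illustrative sketch with hypothetical data: flag taxpayers whose claimed ITC
# exceeds the ITC reflected against them by more than a tolerance threshold.

TOLERANCE = 0.05  # flag claims exceeding reflected credit by more than 5%

returns = [
    {"gstin": "24AAAAA0000A1Z5", "itc_claimed": 100000, "itc_reflected": 100000},
    {"gstin": "24BBBBB1111B1Z5", "itc_claimed": 180000, "itc_reflected": 120000},
    {"gstin": "24CCCCC2222C1Z5", "itc_claimed": 50000,  "itc_reflected": 49000},
]

def flag_itc_mismatches(records, tolerance=TOLERANCE):
    """Return GSTINs whose claimed ITC exceeds reflected ITC beyond tolerance."""
    flagged = []
    for r in records:
        if r["itc_claimed"] > r["itc_reflected"] * (1 + tolerance):
            flagged.append(r["gstin"])
    return flagged

print(flag_itc_mismatches(returns))  # only the second taxpayer is flagged
```

Note that such a rule-based flag only shortlists cases for human scrutiny; it decides nothing by itself, which is precisely the division of labour the Court’s ruling insists upon.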

6.4 Responding to Legal Queries

Officers increasingly consult AI chatbots for quick answers on GST provisions, circular clarifications, and tribunal positions. Such responses, if acted upon without verification, can lead to precisely the kind of error the Gujarat High Court has now highlighted. Hence the importance of careful prompt engineering, and above all verification, in obtaining reliable results.

(C) Where Things Go Wrong

7. The Gujarat High Court case is not an isolated incident. It represents a systemic risk wherever AI tools are used without appropriate safeguards. The primary modes of misuse include:

7.1 Uncritical Acceptance of AI-Generated Citations

The most dangerous misuse, as demonstrated in the Gujarat case, is treating AI-generated legal citations as verified precedents. Generative AI models, including the most advanced ones, routinely hallucinate: they produce citations that sound credible, complete with party names, court names, and case numbers, but which do not exist. An officer who inserts such citations into an order without verifying them against official databases is building a legal edifice on sand, bound to collapse under scrutiny.

7.2 Delegation of Judicial Reasoning

Quasi-judicial adjudication requires the application of mind, a legal standard that demands that the officer personally consider the facts, evaluate the evidence, apply the law, and arrive at a reasoned conclusion. When AI is used to generate not just the research but the reasoning itself, this fundamental duty is abdicated. The order becomes the AI’s conclusion stamped with the officer’s signature, rather than the officer’s conclusion informed by AI research.

7.3 Over-reliance on AI Summaries

AI-generated summaries of judgments can be misleading. Context is often lost; nuances in ratio versus obiter are frequently misrepresented; and the distinction between a judgment that supports the department and one that is merely factually similar is often blurred. Officers who rely on AI summaries without reading the original judgment risk fundamental misapplication of precedent.

The principle of audi alteram partem (the right to be heard) and the obligation to pass speaking orders are cornerstones of quasi-judicial fairness. When AI-generated content replaces authentic reasoning, both principles are effectively compromised, and justice is reduced to an empty formality.

(D) Precautions for Adjudication Officers

8. The Gujarat High Court’s ruling, read purposively, provides the outline of a best-practice framework for the lawful and responsible use of AI in adjudication. Officers must observe the following precautions:

  • Verify Every Citation: No AI-generated citation should be included in an order without first verifying its existence, accuracy, and relevance on an authorised legal database. The officer should read the actual judgment, not just the AI’s summary of it.
  • Use AI for Research, Not Reasoning: AI may be used to identify potentially relevant case law, but the selection, analysis, and application of that case law must be the officer’s own independent work.
  • Document the Verification Process: Officers should maintain a brief internal record of how key citations were verified, providing an audit trail that demonstrates application of mind.
  • Cross-Check AI Outputs: Where possible, AI-generated legal research should be cross-checked against at least one independent source — a published commentary, a tribunal order repository, or a senior officer’s review.
  • Understand the Limitations of the Tool: Officers must receive training on what AI can and cannot do. In particular, they must understand that hallucination is an inherent characteristic of current generative AI systems, not an occasional bug.
  • Institutional Guidelines are Essential: Tax departments and regulatory bodies should issue formal guidelines specifying permissible uses of AI in adjudicatory processes, with clear prohibitions on the unchecked use of AI-generated citations in formal orders.
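Purely as an illustration of the verification and audit-trail precautions above, one could imagine the officer’s checklist as a simple structured record. The structure, field names and sample entries below are hypothetical sketches, not any prescribed departmental format.

```python
# Hypothetical sketch: an audit-trail record an officer might keep while
# verifying AI-suggested citations against an authorised legal database.

from dataclasses import dataclass

@dataclass
class CitationCheck:
    citation: str        # citation as suggested by the AI tool
    source: str          # authorised database consulted (hypothetical)
    exists: bool         # found in the database?
    judgment_read: bool  # full text read by the officer?
    relevant: bool       # bears on the issue in dispute?

    @property
    def usable(self) -> bool:
        """A citation may enter an order only if it exists, was read, and is relevant."""
        return self.exists and self.judgment_read and self.relevant

checks = [
    CitationCheck("X v. Y (hypothetical)", "Official reporter", True, True, True),
    CitationCheck("A v. B (hypothetical)", "Official reporter", False, False, False),
]

usable = [c.citation for c in checks if c.usable]
print(usable)  # only the verified, read and relevant citation survives
```

The point of such a record is not the code but the discipline: each citation carries an explicit trail showing that existence, reading and relevance were each confirmed before reliance.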

9. The Gujarat High Court’s caution is not a call to abandon AI; it is a call for informed, responsible, and verified use. Used correctly, AI has transformative potential in tax administration, inter alia in the following areas:

  • Preliminary Case Research
  • Drafting Assistance for Standard Notices
  • Summarisation of Voluminous Records
  • Training and Knowledge Management
  • Taxpayer Communication and Assistance
  • Pattern Recognition and Audit Selection

10. The Gujarat High Court’s interim order in R/Spl C.A. No. 2229/2026 carries significance well beyond its immediate facts. It is, in effect, a constitutional reminder that the rule of law cannot be outsourced, whether to technology, subordinates, or institutional inertia. Several broader implications deserve attention. Tax departments must formulate formal AI usage policies for adjudicating officers; the absence of such guidelines currently creates both legal vulnerability and operational inconsistency. The principle is universal: AI-generated content in formal orders must be verified before reliance. From a jurisprudential standpoint, the case raises important questions about what constitutes a ‘speaking order’ under administrative law when the reasoning is machine-generated. Future litigation may scrutinise not merely the accuracy of citations, but the authenticity of the reasoning process itself.

(E) Before Bidding Adieu…

11. The Gujarat High Court’s warning on AI hallucinations in GST adjudication is timely, necessary, and instructive. It does not condemn technology; it contextualises it. AI is a powerful instrument, but an instrument nonetheless, and like all instruments, its value depends entirely on the skill, care, and judgment of the person wielding it. For adjudicating officers under the GST framework, the lesson is clear: AI may be used to research, to draft, to retrieve, and to assist, but the final order must be the product of the officer’s own verified understanding, independent reasoning, and conscientious application of the law. Every citation must be real. Every precedent must be relevant. Every conclusion must be the officer’s own. The Gujarat High Court has placed a judicial marker, emphasising that

Technology must complement and never replace the independent application of mind that lies at the heart of every fair decision in the interest of justice.

Jai Hind!

Author Bio

The Author is one of the very few officers in the department to have won all three of the highest prestigious awards at Zonal and National levels. He was awarded the “SAMAAN - Best Officer Award” in 1999 at the Chennai Central Excise Zonal level and is a recipient of the esteemed “CBEC Chairman’s Commendation”.



