How to Build a 'Chat with IFC 3D Model' App
Deep Dive into AI-Powered Building Information Model 3D Assistant
In the world of architecture, engineering, and construction, Industry Foundation Classes (IFC) files are the backbone of Building Information Modeling (BIM). But what if we could make these complex files more accessible and interactive? Let's explore a project that combines IFC files with Large Language Models (LLMs) to create a chat-like interface for querying building information.
Script credits: Joao Silva: https://www.linkedin.com/in/joaomiguelarch/
GitHub repository with the code: https://github.com/Mistrymm7/chat_with_ifc_llm_workshop
The Process: From IFC to Natural Language Answers
The process can be broken down into four main steps:
Load IFC and convert to SQL
Generate SQL query from a natural language question
Execute the SQL query
Construct a natural language answer
Let's dive into each step and examine some key code snippets.
Step 1: Loading IFC and Converting to SQL
The first step involves loading the IFC file and converting it into a more query-friendly format: SQL. Here's how it's done:
import ifcopenshell
import pandas as pd
import sqlite3

# Load the IFC file (file_path points to the .ifc model)
ifc_file = ifcopenshell.open(file_path)

# Filter for specific element types
class_names = ["IfcDoor", "IfcWindow"]

# Write one spreadsheet sheet per element type
with pd.ExcelWriter(output_csv_name, engine='openpyxl') as writer:
    for class_name in class_names:
        objects = ifc_file.by_type(class_name)
        # ... (code to extract properties and create DataFrame)
        result_df.to_excel(writer, sheet_name=class_name, index=False)

# Load the spreadsheet and write each sheet to a SQLite table
conn = sqlite3.connect('/content/ifcdatabase.db')
sheets_dict = pd.read_excel(csv_file_path, sheet_name=None)
for sheet_name, df in sheets_dict.items():
    # ... (code to create tables and insert data)
    df.to_sql(sheet_name, conn, if_exists='append', index=False)
This snippet loads the IFC file, extracts the chosen element types, writes them to a spreadsheet with one sheet per element type, and then loads each sheet into its own table in a SQLite database.
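The property-extraction step is elided above. One way to build that DataFrame is with ifcopenshell's property-set helpers; the sketch below is for illustration only, and the workshop notebook may collect a different set of attributes.

import ifcopenshell
import ifcopenshell.util.element
import pandas as pd

def elements_to_dataframe(ifc_file, class_name):
    # Collect basic attributes plus all property-set values for each element
    rows = []
    for element in ifc_file.by_type(class_name):
        row = {
            "GlobalId": element.GlobalId,
            "Name": element.Name,
        }
        # Flatten property sets (e.g. Pset_DoorCommon) into columns
        for pset_name, props in ifcopenshell.util.element.get_psets(element).items():
            for prop_name, value in props.items():
                row[f"{pset_name}.{prop_name}"] = value
        rows.append(row)
    return pd.DataFrame(rows)

# Example: result_df = elements_to_dataframe(ifc_file, "IfcDoor")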
Step 2: Generating SQL Queries with LLMs
Once we have our data in a SQL database, we can use an LLM to generate SQL queries from natural language questions. Here's the function that does this:
def generate_sql_query(dB_context: str, user_question: str) -> str:
    response = client.chat.completions.create(
        model=completion_model,
        messages=[
            {
                "role": "system",
                "content": f"You are a SQLite expert working with a database that has multiple tables, one for each building element type. ... {dB_context}"
            },
            {
                "role": "user",
                "content": f"# User question #\n{user_question}"
            },
        ],
    )
    return response.choices[0].message.content
This function uses OpenAI's API to generate an SQL query based on the database schema and the user's question.
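Note that the function assumes a client and a completion_model have already been configured. A minimal setup with the official openai package might look like this; the model name is just a placeholder, so use whichever chat model you prefer.

from openai import OpenAI

# Placeholder configuration; the workshop notebook may load the key differently
client = OpenAI(api_key="YOUR_OPENAI_API_KEY")
completion_model = "gpt-4o-mini"  # assumed model name, swap for the one you use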
Step 3: Executing the SQL Query
With the SQL query in hand, we can now execute it against our database:
def execute_sql_query(dB_path, sql_query):
    conn = sqlite3.connect(dB_path)
    cursor = conn.cursor()
    cursor.execute(sql_query)
    result = cursor.fetchall()
    conn.close()
    return result
This function connects to the SQLite database, executes the query, and returns the results.
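As a quick sanity check, you can run a hand-written query against the database built in Step 1; the IfcDoor table exists because it was one of the sheet names created earlier.

# Count the doors directly, without involving the LLM
door_count = execute_sql_query('/content/ifcdatabase.db', 'SELECT COUNT(*) FROM IfcDoor')
print(door_count)  # returns a list of result tuples, e.g. [(count,)]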
Step 4: Generating Natural Language Answers
Finally, we use the LLM again to generate a natural language answer based on the SQL query and its results:
def generate_answer(sql_query: str, sql_result: str, user_question: str) -> str:
    response = client.chat.completions.create(
        model=completion_model,
        messages=[
            {
                "role": "system",
                "content": "You have to answer a user question according to the SQL query and its result. ..."
            },
            {
                "role": "user",
                "content": f"User question: {user_question}\nSQL Query: {sql_query}\nSQL Result: {sql_result}\nAnswer:"
            },
        ],
    )
    return response.choices[0].message.content
This function takes the original question, the SQL query, and the query results, and generates a human-readable answer.
Putting It All Together
The entire process is wrapped up in a Gradio interface, allowing users to interact with the system through a simple web interface:
import gradio as gr

def query_ifc(user_question):
    db_schema = get_dB_schema(IFC_SQL_DB)
    sql_query = generate_sql_query(db_schema, user_question)
    sql_result = execute_sql_query(IFC_SQL_DB, sql_query)
    llm_answer = generate_answer(sql_query, sql_result, user_question)
    return f"SQL Query: {sql_query}\n\nSQL Result: {sql_result}\n\nAnswer: {llm_answer}"

iface = gr.Interface(
    fn=query_ifc,
    inputs=gr.Textbox(lines=2, placeholder="Enter your question about the IFC model here..."),
    outputs="text",
    title="IFC Query System",
    description="Ask questions about your IFC model in natural language.",
    examples=[
        ["How many doors are in the project?"],
        ["What is the total area of all windows?"],
        ["How many floors does the building have?"],
        ["What is the height of the tallest wall?"]
    ]
)

iface.launch()
This code creates a web interface where users can input questions about the IFC model and receive answers in natural language.
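One helper used in query_ifc, get_dB_schema, isn't shown in the snippets above. A minimal sketch, assuming it simply reads the CREATE TABLE statements from SQLite's sqlite_master table to give the LLM its schema context, could look like this:

def get_dB_schema(dB_path):
    # Collect the CREATE TABLE statement of every table so the LLM
    # knows which tables and columns it can query.
    conn = sqlite3.connect(dB_path)
    cursor = conn.cursor()
    cursor.execute("SELECT sql FROM sqlite_master WHERE type='table'")
    schema = "\n\n".join(row[0] for row in cursor.fetchall() if row[0])
    conn.close()
    return schema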
Conclusion
By combining IFC files, SQL databases, and Large Language Models, we've created a powerful tool that allows non-technical users to query complex building information models using natural language. This approach has the potential to revolutionize how architects, engineers, and construction professionals interact with BIM data, making it more accessible and user-friendly than ever before.