+-----------------------------------------+
|  W   W   EEEEE   EEEEE   K   K   EEEEE  |
|  W   W   E       E       K  K    E      |
|  W W W   EEEE    EEEE    KKK     EEEE   |
|  WW WW   E       E       K  K    E      |
|  W   W   EEEEE   EEEEE   K   K   EEEEE  |
+-----------------------------------------+
  

Weeke

TryHackMe Badge
User: Bailey Dunlap
Skills: C++, Python, MalDev
Certifications: 
Bio: Passionate coder & security enthusiast.
      

Write-ups

Title                                         Category      Date
------------------------------------------    ------------  --------
Spawn CMD in memory and execute batch code    Exploitation  Apr 2025
Auto start up using NT Functions              Exploitation  Apr 2025
NtDll UnHooking                               Technique     Apr 2025
Windows 10 UAC Bypass                         Technique     Apr 2025

Blog

Code Snippets

Scrapy Form Enumerator
import scrapy
from urllib.parse import urljoin

class FormEnumeratorSpider(scrapy.Spider):
    name = "form_enum"
    allowed_domains = ["example.com"]      # change to your target domain
    start_urls = ["https://example.com/"]  # entry point

    custom_settings = {
        'ROBOTSTXT_OBEY': False,     # ignore robots.txt
        'DOWNLOAD_DELAY': 0.5,       # be polite-ish
        'CONCURRENT_REQUESTS': 8,
        'USER_AGENT': 'Mozilla/5.0 (X11; Linux x86_64) Scrapy/1.8 (+https://scrapy.org)'
    }

    def parse(self, response):
        # Extract and follow all internal links
        for href in response.css('a::attr(href)').getall():
            url = urljoin(response.url, href)
            if self.allowed_domains[0] in url:
                yield scrapy.Request(url, callback=self.parse_forms)

    def parse_forms(self, response):
        # On each page, find <form> elements and record their fields
        for form in response.css('form'):
            action = form.attrib.get('action', '')
            full_action = urljoin(response.url, action)

            inputs = []
            for inp in form.css('input'):
                name = inp.attrib.get('name')
                typ = inp.attrib.get('type', 'text')
                if name:
                    inputs.append({'name': name, 'type': typ})

            yield {
                'page_url': response.url,
                'form_action': full_action,
                'method': form.attrib.get('method', 'GET').upper(),
                'inputs': inputs
            }
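The spider can be run with Scrapy's command line (for example, scrapy runspider against the file it is saved in, with -o to write the scraped forms to a feed). Below is a minimal standalone runner sketch using CrawlerProcess; the output filename forms.json is an assumption, and the FEEDS setting requires a reasonably recent Scrapy release.

# Minimal runner sketch: assumes FormEnumeratorSpider (above) is defined in
# this same file, and that writing JSON output to "forms.json" is acceptable.
from scrapy.crawler import CrawlerProcess

if __name__ == "__main__":
    process = CrawlerProcess(settings={
        'FEEDS': {'forms.json': {'format': 'json'}},  # export scraped items as JSON
    })
    process.crawl(FormEnumeratorSpider)
    process.start()  # blocks until the crawl finishes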

Projects

Contact

Email: weeke@weeke.xyz

GitHub: github.com/bdunlap9

Twitter: @TheRealWeeke