Web scraping - extracting data using Jsoup in Java


I am trying to run this code, but I am facing a "NullPointerException" in the program. I used try and catch, but I don't know how to eliminate the problem. Here is the code:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import java.net.*;
import java.io.*;
import java.lang.NullPointerException;

public class WikiScraper {

    public static void main(String[] args) throws IOException {
        scrapeTopic("/wiki/Python");
    }

    public static void scrapeTopic(String url) {
        String html = getUrl("http://www.wikipedia.org/" + url);
        Document doc = Jsoup.parse(html);
        String contentText = doc.select("#mw-content-text>p").first().text();
        System.out.println(contentText);
        System.out.println("The url was malformed!");
    }

    public static String getUrl(String url) {
        URL urlObj = null;
        try {
            urlObj = new URL(url);
        } catch (MalformedURLException e) {
            System.out.println("The url was malformed!");
            return "";
        }
        URLConnection urlCon = null;
        BufferedReader in = null;
        String outputText = "";
        try {
            urlCon = urlObj.openConnection();
            in = new BufferedReader(new InputStreamReader(urlCon.getInputStream()));
            String line = "";
            while ((line = in.readLine()) != null) {
                outputText += line;
            }
            in.close();
        } catch (IOException e) {
            System.out.println("There was an error connecting to the URL");
            return "";
        }
        return outputText;
    }
}

The error shown is:

There was an error connecting to the URL
Exception in thread "main" java.lang.NullPointerException
    at hello.WikiScraper.scrapeTopic(WikiScraper.java:17)
    at hello.WikiScraper.main(WikiScraper.java:11)

You have

public static String getUrl(String url) {
    // ...
    return "";
}

which ends up returning an empty string whenever an exception is caught. Jsoup then parses that empty string into a document that contains no "#mw-content-text>p" element, so first() returns null and calling text() on it throws the NullPointerException you are seeing.
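
A quick way to avoid the crash is to check the selection for null before calling text(). A minimal sketch of such a guard (the variable name and message below are only illustrative, not from the original code):

import org.jsoup.nodes.Element;

// select() on an empty or unexpected document finds nothing,
// so first() returns null and must be checked before use.
Element firstParagraph = doc.select("#mw-content-text>p").first();
if (firstParagraph == null) {
    System.out.println("No paragraph found - the downloaded HTML was empty or unexpected.");
} else {
    System.out.println(firstParagraph.text());
}

That only hides the symptom, though; the underlying problem is that the HTML never gets downloaded.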

Try

Document doc = Jsoup.connect("http://example.com/").get();

for example.
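
Put together, the scraping method could let Jsoup fetch and parse the page in one step. A rough sketch along those lines (the en.wikipedia.org host, the null check, and the messages are assumptions added here, not part of the original answer):

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import java.io.IOException;

public class WikiScraper {

    public static void main(String[] args) throws IOException {
        scrapeTopic("/wiki/Python");
    }

    // Jsoup downloads and parses the page itself, so there is no
    // hand-rolled getUrl() that can silently return an empty string.
    public static void scrapeTopic(String path) throws IOException {
        Document doc = Jsoup.connect("http://en.wikipedia.org" + path).get();
        Element firstParagraph = doc.select("#mw-content-text>p").first();
        if (firstParagraph == null) {
            System.out.println("No paragraph found at " + path);
        } else {
            System.out.println(firstParagraph.text());
        }
    }
}

Jsoup.connect(...).get() throws an IOException on connection problems instead of swallowing them, so a failed download shows up as a clear error rather than as a NullPointerException later on.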

