Ministers are looking into imposing time limits for children using social media platforms, according to Culture Secretary Matt Hancock.
He said the negative impact of posting and consuming content online was a "genuine concern".
The politician said more needed to be done to safeguard young people and suggested a new age-verification system to address concerns.
He said there would be a new legal requirement for companies to ensure users were aged over 13. The details of how such a scheme would work are still being hashed out.
"There is a genuine concern about the amount of screen time young people are clocking up and the negative impact it could have on their lives," he told The Times.
"For an adult I wouldn't want to restrict the amount of time you are on a platform but for different ages it might be right to have different time cut-offs."
Mr Hancock suggested varying cut-off times for different ages on platforms such as Facebook, Instagram and Snapchat.
The platforms already specify a minimum age of 13, but children can gain access simply by falsifying their date of birth, even though critics claim most social networking sites are able to identify underage users from their browsing habits.
Mr Hancock said it was "not beyond the wit of man" to develop an age-verification system for children.
The move to clamp down on children using social media comes amid plans to extend a law requiring pornographic websites to verify, using bank cards, that users are over 18.
A small-scale poll by the Association of School and College Leaders (ASCL) recently found that most of the school leaders who took part believed children's mental health had suffered as a result of social media over the past year.
The Government has also announced it will introduce a new code of practice this year setting out the minimum expectations on social media giants in a bid to make the UK "the safest place in the world to be online".